ReadOnce Transformers: Reusable Representations of Text for Transformers

Shih-Ting Lin, Ashish Sabharwal, Tushar Khot


Abstract
We present ReadOnce Transformers, an approach to convert a transformer-based model into one that can build an information-capturing, task-independent, and compressed representation of text. The resulting representation is reusable across different examples and tasks, thereby requiring a document shared across many examples or tasks to only be read once. This leads to faster training and evaluation of models. Additionally, we extend standard text-to-text transformer models to Representation+Text-to-text models, and evaluate on multiple downstream tasks: multi-hop QA, abstractive QA, and long-document summarization. Our one-time computed representation results in a 2x-5x speedup compared to standard text-to-text models, while the compression also allows existing language models to handle longer documents without the need for designing new pre-trained models.
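Since the abstract only sketches the architecture, the following is a minimal, hypothetical PyTorch illustration of the core idea: encode a document once into a short, compressed representation, cache it, and reuse it across many questions. This is a toy sketch, not the authors' implementation; all class names, sizes, and the slot-attention pooling mechanism are illustrative assumptions.

```python
# Toy sketch of the ReadOnce idea (not the authors' code): a document of L tokens
# is compressed into a few "slot" vectors once, then reused for every question.
import torch
import torch.nn as nn

class ToyDocumentEncoder(nn.Module):
    """Stand-in for a transformer encoder that compresses a token sequence
    of length L into a much shorter sequence of `num_slots` vectors."""
    def __init__(self, hidden=64, num_slots=8):
        super().__init__()
        self.embed = nn.Embedding(1000, hidden)                     # toy vocabulary
        self.layer = nn.TransformerEncoderLayer(hidden, nhead=4, batch_first=True)
        self.slots = nn.Parameter(torch.randn(num_slots, hidden))   # learned summary queries
        self.pool = nn.MultiheadAttention(hidden, num_heads=4, batch_first=True)

    def forward(self, token_ids):                                   # (batch, L) -> (batch, num_slots, hidden)
        h = self.layer(self.embed(token_ids))                       # contextualize document tokens
        q = self.slots.unsqueeze(0).expand(token_ids.size(0), -1, -1)
        compressed, _ = self.pool(q, h, h)                          # pool L tokens into num_slots vectors
        return compressed

encoder = ToyDocumentEncoder()
doc_ids = torch.randint(0, 1000, (1, 512))                          # one "document" of 512 token ids

# Read the document once and cache its compressed representation.
with torch.no_grad():
    doc_repr = encoder(doc_ids)                                     # (1, 8, 64): far shorter than the input

# A Representation+Text-to-text model would then consume `doc_repr` together with
# the short question tokens, instead of re-encoding the full document every time.
for question in ["question 1", "question 2", "question 3"]:
    print(question, "-> reuses cached doc_repr of shape", tuple(doc_repr.shape))
```

Because the expensive document encoding happens only once, repeated queries over the same document pay only the cost of encoding the (short) question, which is the source of the reported 2x-5x speedup over re-reading the full text per example.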
Anthology ID:
2021.acl-long.554
Volume:
Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)
Month:
August
Year:
2021
Address:
Online
Editors:
Chengqing Zong, Fei Xia, Wenjie Li, Roberto Navigli
Venues:
ACL | IJCNLP
Publisher:
Association for Computational Linguistics
Pages:
7129–7141
URL:
https://aclanthology.org/2021.acl-long.554
DOI:
10.18653/v1/2021.acl-long.554
Cite (ACL):
Shih-Ting Lin, Ashish Sabharwal, and Tushar Khot. 2021. ReadOnce Transformers: Reusable Representations of Text for Transformers. In Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), pages 7129–7141, Online. Association for Computational Linguistics.
Cite (Informal):
ReadOnce Transformers: Reusable Representations of Text for Transformers (Lin et al., ACL-IJCNLP 2021)
PDF:
https://aclanthology.org/2021.acl-long.554.pdf
Video:
https://aclanthology.org/2021.acl-long.554.mp4
Data
HotpotQA | NarrativeQA | SQuAD