%0 Conference Proceedings
%T ReadTwice: Reading Very Large Documents with Memories
%A Zemlyanskiy, Yury
%A Ainslie, Joshua
%A de Jong, Michiel
%A Pham, Philip
%A Eckstein, Ilya
%A Sha, Fei
%Y Toutanova, Kristina
%Y Rumshisky, Anna
%Y Zettlemoyer, Luke
%Y Hakkani-Tur, Dilek
%Y Beltagy, Iz
%Y Bethard, Steven
%Y Cotterell, Ryan
%Y Chakraborty, Tanmoy
%Y Zhou, Yichao
%S Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
%D 2021
%8 June
%I Association for Computational Linguistics
%C Online
%F zemlyanskiy-etal-2021-readtwice
%X Knowledge-intensive tasks such as question answering often require assimilating information from different sections of large inputs such as books or article collections. We propose ReadTwice, a simple and effective technique that combines several strengths of prior approaches to model long-range dependencies with Transformers. The main idea is to read text in small segments, in parallel, summarizing each segment into a memory table to be used in a second read of the text. We show that the method outperforms models of comparable size on several question answering (QA) datasets and sets a new state of the art on the challenging NarrativeQA task, with questions about entire books.
%R 10.18653/v1/2021.naacl-main.408
%U https://aclanthology.org/2021.naacl-main.408
%U https://doi.org/10.18653/v1/2021.naacl-main.408
%P 5189-5195