∞-former: Infinite Memory Transformer

Pedro Henrique Martins, Zita Marinho, André Martins


Abstract
Transformers are unable to model long-term memories effectively, since the amount of computation they need to perform grows with the context length. While variations of efficient transformers have been proposed, they all have a finite memory capacity and are forced to drop old information. In this paper, we propose the ∞-former, which extends the vanilla transformer with an unbounded long-term memory. By making use of a continuous-space attention mechanism to attend over the long-term memory, the ∞-former’s attention complexity becomes independent of the context length, trading off memory length with precision. In order to control where precision is more important, ∞-former maintains “sticky memories,” being able to model arbitrarily long contexts while keeping the computation budget fixed. Experiments on a synthetic sorting task, language modeling, and document grounded dialogue generation demonstrate the ∞-former’s ability to retain information from long sequences.
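For a concrete picture of the mechanism the abstract describes, the snippet below is a minimal NumPy sketch of continuous attention over a compressed memory: an arbitrarily long sequence of hidden states is fit with a fixed number of radial basis functions, and the attention output is the expectation of that continuous signal under a Gaussian density, so the cost depends on the number of basis functions rather than on the context length. This is an illustrative sketch, not the authors' released implementation (see deep-spin/infinite-former for that); the function names, the RBF width, the ridge regularization, and the grid-based numerical integration are all assumptions made for the example.

# Illustrative sketch of the ∞-former's core idea (not the paper's code):
# compress a length-L memory into N basis coefficients, then attend to it
# with a Gaussian density over the continuous domain [0, 1].
import numpy as np

def rbf_basis(t, centers, width=0.05):
    # Evaluate N Gaussian RBFs at positions t in [0, 1]; returns (len(t), N).
    return np.exp(-((t[:, None] - centers[None, :]) ** 2) / (2 * width ** 2))

def compress_memory(X, num_basis=32):
    # Fit coefficients B (num_basis x d) so that psi(t) @ B approximates the
    # length-L sequence X (L x d) sampled at evenly spaced positions in [0, 1].
    L = X.shape[0]
    t = np.linspace(0.0, 1.0, L)
    centers = np.linspace(0.0, 1.0, num_basis)
    Psi = rbf_basis(t, centers)                       # (L, num_basis)
    # Ridge-regularized least squares keeps the fit numerically stable.
    B = np.linalg.solve(Psi.T @ Psi + 1e-4 * np.eye(num_basis), Psi.T @ X)
    return B, centers

def continuous_attention(B, centers, mu, sigma, num_samples=256):
    # Context vector c = E_{t ~ N(mu, sigma^2)}[ x(t) ], with x(t) = psi(t)^T B,
    # approximated here by numerical integration on a fixed grid over [0, 1].
    t = np.linspace(0.0, 1.0, num_samples)
    density = np.exp(-0.5 * ((t - mu) / sigma) ** 2)
    density /= density.sum()                          # normalize on the grid
    Psi = rbf_basis(t, centers)                       # (num_samples, num_basis)
    return (density @ Psi) @ B                        # (d,)

# Usage: a 10,000-step memory is reduced to 32 coefficients per dimension,
# and attending to it only touches those coefficients, not the full sequence.
X = np.random.randn(10_000, 64)                       # long "memory" of hidden states
B, centers = compress_memory(X, num_basis=32)
c = continuous_attention(B, centers, mu=0.8, sigma=0.05)
print(c.shape)                                        # (64,)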
Anthology ID:
2022.acl-long.375
Volume:
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
5468–5485
URL:
https://aclanthology.org/2022.acl-long.375
DOI:
10.18653/v1/2022.acl-long.375
Cite (ACL):
Pedro Henrique Martins, Zita Marinho, and André Martins. 2022. ∞-former: Infinite Memory Transformer. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 5468–5485, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
∞-former: Infinite Memory Transformer (Martins et al., ACL 2022)
PDF:
https://aclanthology.org/2022.acl-long.375.pdf
Video:
https://aclanthology.org/2022.acl-long.375.mp4
Code:
deep-spin/infinite-former
Data:
PG-19, WikiText-103, WikiText-2