PermuteFormer: Efficient Relative Position Encoding for Long Sequences

Peng Chen


Abstract
A recent variation of Transformer, Performer, scales Transformer to longer sequences with a linear attention mechanism. However, it is not compatible with relative position encoding, which has advantages over absolute position encoding. In this paper, we discuss possible ways to add relative position encoding to Performer. Based on the analysis, we propose PermuteFormer, a Performer-based model with relative position encoding that scales linearly on long sequences. PermuteFormer applies a position-dependent transformation to queries and keys to encode positional information into the attention module. This transformation is carefully crafted so that the final output of self-attention is not affected by the absolute positions of tokens. PermuteFormer introduces negligible computational overhead by design, so that it runs as fast as Performer. We evaluate PermuteFormer on Long-Range Arena, a dataset for long sequences, as well as WikiText-103, a language modeling dataset. The experiments show that PermuteFormer uniformly improves the performance of Performer with almost no computational overhead and outperforms vanilla Transformer on most of the tasks.
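The paper's implementation is available in the repository linked below. As a rough illustration of the mechanism described in the abstract, the sketch that follows applies a per-position permutation of query/key features on top of a Performer-style (non-causal) linear attention in NumPy. All function and variable names are illustrative assumptions, not taken from the paper's code.

import numpy as np

def permuted_linear_attention(q_feat, k_feat, v, perm):
    # q_feat, k_feat: (seq_len, dim) query/key features after the Performer
    #                 feature map (assumed non-negative).
    # v:              (seq_len, dim_v) values.
    # perm:           a fixed permutation of range(dim); position t uses perm
    #                 composed with itself t times, so the dot product between a
    #                 permuted query at position i and a permuted key at
    #                 position j depends only on j - i (relative position).
    seq_len, dim = q_feat.shape
    idx = np.arange(dim)               # perm^0 = identity
    q_p = np.empty_like(q_feat)
    k_p = np.empty_like(k_feat)
    for t in range(seq_len):
        q_p[t] = q_feat[t, idx]        # permute features of the query at position t
        k_p[t] = k_feat[t, idx]        # permute features of the key at position t
        idx = perm[idx]                # advance to perm^(t+1)
    # Non-causal linear attention:
    # out_i = sum_j (q_p[i] . k_p[j]) v[j] / sum_j (q_p[i] . k_p[j])
    num = q_p @ (k_p.T @ v)            # (seq_len, dim_v), computed in linear time
    den = q_p @ k_p.sum(axis=0)        # (seq_len,)
    return num / den[:, None]

# Example usage with random features and a random permutation.
rng = np.random.default_rng(0)
q = rng.random((8, 16)); k = rng.random((8, 16)); v = rng.random((8, 4))
out = permuted_linear_attention(q, k, v, rng.permutation(16))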
Anthology ID:
2021.emnlp-main.828
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
10606–10618
URL:
https://aclanthology.org/2021.emnlp-main.828
DOI:
10.18653/v1/2021.emnlp-main.828
Bibkey:
Cite (ACL):
Peng Chen. 2021. PermuteFormer: Efficient Relative Position Encoding for Long Sequences. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 10606–10618, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
PermuteFormer: Efficient Relative Position Encoding for Long Sequences (Chen, EMNLP 2021)
PDF:
https://aclanthology.org/2021.emnlp-main.828.pdf
Video:
https://aclanthology.org/2021.emnlp-main.828.mp4
Code:
cpcp1998/permuteformer
Data:
WikiText-103, WikiText-2