Latent Positional Information is in the Self-Attention Variance of Transformer Language Models Without Positional Embeddings

Ta-Chung Chi, Ting-Han Fan, Li-Wei Chen, Alexander Rudnicky, Peter Ramadge


Abstract
The use of positional embeddings in transformer language models is widely accepted. However, recent research has called into question the necessity of such embeddings. We further extend this inquiry by demonstrating that a randomly initialized and frozen transformer language model, devoid of positional embeddings, inherently encodes strong positional information through the shrinkage of self-attention variance. To quantify this variance, we derive the underlying distribution of each step within a transformer layer. Through empirical validation using a fully pretrained model, we show that the variance shrinkage effect still persists after extensive gradient updates. Our findings serve to justify the decision to discard positional embeddings and thus facilitate more efficient pretraining of transformer language models.
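To illustrate the mechanism the abstract describes, the sketch below (not from the paper; the layer sizes, the single attention sub-layer, and the variance estimator are illustrative assumptions) feeds random token embeddings, with no positional embeddings, through a randomly initialized and frozen causal self-attention layer in PyTorch and measures the output variance at each position. Because later positions average over a longer causal window of value vectors, this variance is expected to shrink as the position index grows, which is the latent positional signal the paper studies.

import torch
import torch.nn as nn

torch.manual_seed(0)

d_model, n_heads, seq_len, batch = 256, 4, 128, 32

# Random token "embeddings" only; no positional embeddings are added.
x = torch.randn(batch, seq_len, d_model)

# Randomly initialized, frozen self-attention layer (a stand-in for one
# transformer sub-layer; the paper derives distributions for full layers).
attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
for p in attn.parameters():
    p.requires_grad_(False)

# Causal mask: position i may only attend to positions j <= i, so later
# positions average over more value vectors.
causal_mask = torch.triu(torch.ones(seq_len, seq_len, dtype=torch.bool), diagonal=1)

with torch.no_grad():
    out, _ = attn(x, x, x, attn_mask=causal_mask)

# Empirical variance of the attention output at each position, pooled over
# the batch and feature dimensions. Averaging over a growing causal window
# shrinks this variance with position, yielding latent positional information.
var_per_pos = out.permute(1, 0, 2).reshape(seq_len, -1).var(dim=1)
print(var_per_pos[:5])    # larger variance near the start of the sequence
print(var_per_pos[-5:])   # smaller variance toward the end

Under these assumptions, printing the per-position variances should show a roughly monotonic decay, consistent with the shrinkage effect described in the abstract.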
Anthology ID: 2023.acl-short.102
Volume: Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)
Month: July
Year: 2023
Address: Toronto, Canada
Editors: Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue: ACL
Publisher: Association for Computational Linguistics
Pages: 1183–1193
URL: https://aclanthology.org/2023.acl-short.102
DOI: 10.18653/v1/2023.acl-short.102
Cite (ACL): Ta-Chung Chi, Ting-Han Fan, Li-Wei Chen, Alexander Rudnicky, and Peter Ramadge. 2023. Latent Positional Information is in the Self-Attention Variance of Transformer Language Models Without Positional Embeddings. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pages 1183–1193, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal): Latent Positional Information is in the Self-Attention Variance of Transformer Language Models Without Positional Embeddings (Chi et al., ACL 2023)
PDF: https://aclanthology.org/2023.acl-short.102.pdf