Anisotropy is Not Inherent to Transformers

Anemily Machina, Robert Mercer


Abstract
Isotropy is the property that embeddings are uniformly distributed around the origin. Previous work has shown that Transformer embedding spaces are anisotropic, which is called the representation degradation problem. This degradation has been assumed to be inherent to the standard language modeling task and to apply to all Transformer models regardless of their architecture. In this work we identify a set of Transformer models with isotropic embedding spaces: the large Pythia models. We examine the isotropy of Pythia models and explore how isotropy and anisotropy develop as a model is trained. We find that anisotropy does not develop as previously theorized: our analysis shows that the large Pythia models optimize their final Layer Norm for isotropy, and we explain why previous theoretical justifications for anisotropy were insufficient. The identification of a set of isotropic Transformer models calls previous assumptions into question, provides a set of models against which to contrast existing analyses, and should lead to deeper insight into isotropy.
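To make the abstract's central notion concrete: one common proxy for (an)isotropy in the literature is the average pairwise cosine similarity of embeddings — near zero when directions are spread uniformly around the origin, near one when all vectors sit in a narrow cone. The sketch below illustrates this proxy on synthetic data; it is not necessarily the measure used in this paper, and the function name and parameters are illustrative.

```python
import numpy as np

def avg_cosine_similarity(embeddings: np.ndarray) -> float:
    """Mean pairwise cosine similarity of the rows of `embeddings`.

    Values near 0 suggest isotropy; values near 1 indicate anisotropy
    (all vectors clustered in a narrow cone).
    """
    normed = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sims = normed @ normed.T
    n = len(embeddings)
    # Exclude the self-similarity terms on the diagonal (each equal to 1).
    return (sims.sum() - n) / (n * (n - 1))

rng = np.random.default_rng(0)
# Isotropic case: Gaussian samples have directions uniform on the sphere.
iso = rng.standard_normal((500, 64))
# Anisotropic case: a large shared offset pushes every vector into one cone,
# mimicking the degenerate embedding geometry the paper discusses.
aniso = iso + 10.0
print(avg_cosine_similarity(iso))    # near 0
print(avg_cosine_similarity(aniso))  # near 1
```

The offset trick in the anisotropic case mirrors a frequently reported cause of the degradation: a common drift direction shared by all token embeddings.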
Anthology ID:
2024.naacl-long.274
Volume:
Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 1: Long Papers)
Month:
June
Year:
2024
Address:
Mexico City, Mexico
Editors:
Kevin Duh, Helena Gomez, Steven Bethard
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
4892–4907
URL:
https://aclanthology.org/2024.naacl-long.274
Cite (ACL):
Anemily Machina and Robert Mercer. 2024. Anisotropy is Not Inherent to Transformers. In Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 1: Long Papers), pages 4892–4907, Mexico City, Mexico. Association for Computational Linguistics.
Cite (Informal):
Anisotropy is Not Inherent to Transformers (Machina & Mercer, NAACL 2024)
PDF:
https://aclanthology.org/2024.naacl-long.274.pdf