Isotropy, Clusters, and Classifiers

Timothee Mickus, Stig-Arne Grönroos, Joseph Attieh


Abstract
Whether embedding spaces use all their dimensions equally, i.e., whether they are isotropic, has been a recent subject of discussion. Evidence has been accrued both for and against enforcing isotropy in embedding spaces. In the present paper, we stress that isotropy imposes requirements on the embedding space that are not compatible with the presence of clusters—which also negatively impacts linear classification objectives. We demonstrate this fact both empirically and mathematically and use it to shed light on previous results from the literature.
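The abstract's central claim — that clustered embeddings cannot be isotropic — can be illustrated with a small numerical sketch. The eigenvalue-ratio score below is one common proxy for isotropy (1.0 when every direction carries equal variance), not necessarily the measure used in the paper; the data, dimensions, and cluster layout are purely illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def isotropy_score(X):
    # Eigenvalue-ratio proxy for isotropy: ratio of the smallest to the
    # largest eigenvalue of the sample covariance. A perfectly isotropic
    # cloud scores 1.0; a cloud dominated by a few directions scores near 0.
    eig = np.linalg.eigvalsh(np.cov(X, rowvar=False))
    return eig.min() / eig.max()

# Isotropic baseline: a standard Gaussian cloud in 50 dimensions.
iso = rng.standard_normal((2000, 50))

# Clustered embeddings: identical noise, but each point is shifted to one
# of three well-separated centroids, concentrating variance in the
# (at most two-dimensional) subspace spanned by the centroid offsets.
centroids = rng.standard_normal((3, 50)) * 10.0
labels = rng.integers(0, 3, size=2000)
clustered = rng.standard_normal((2000, 50)) + centroids[labels]

print(isotropy_score(iso))        # markedly higher
print(isotropy_score(clustered))  # much smaller: clusters break isotropy
```

The gap between the two scores is the phenomenon the paper formalizes: the between-cluster variance inflates a handful of eigenvalues, so the space cannot use all its dimensions equally.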
Anthology ID: 2024.acl-short.7
Volume: Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)
Month: August
Year: 2024
Address: Bangkok, Thailand
Editors: Lun-Wei Ku, Andre Martins, Vivek Srikumar
Venue: ACL
Publisher: Association for Computational Linguistics
Pages: 75–84
URL: https://aclanthology.org/2024.acl-short.7
DOI: 10.18653/v1/2024.acl-short.7
Cite (ACL): Timothee Mickus, Stig-Arne Grönroos, and Joseph Attieh. 2024. Isotropy, Clusters, and Classifiers. In Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pages 75–84, Bangkok, Thailand. Association for Computational Linguistics.
Cite (Informal): Isotropy, Clusters, and Classifiers (Mickus et al., ACL 2024)
PDF: https://aclanthology.org/2024.acl-short.7.pdf