An Isotropy Analysis in the Multilingual BERT Embedding Space

Sara Rajaee, Mohammad Taher Pilehvar


Abstract
Several studies have explored the advantages of multilingual pre-trained models (such as multilingual BERT) in capturing shared linguistic knowledge. However, less attention has been paid to their limitations. In this paper, we investigate multilingual BERT for two known issues of monolingual models: an anisotropic embedding space and outlier dimensions. We show that, unlike its monolingual counterpart, multilingual BERT exhibits no outlier dimensions in its representations, while its embedding space is highly anisotropic. Moreover, whereas a few dimensions in monolingual BERT contribute heavily to its anisotropic distribution, we observe no such dimensions in multilingual BERT. Furthermore, our experimental results demonstrate that increasing the isotropy of the multilingual space can significantly improve its representation power and performance, similarly to what has been observed for monolingual contextual word representations (CWRs) on semantic similarity tasks. Our analysis indicates that, despite having different degenerated directions, the embedding spaces of the various languages tend to be partially similar in structure.
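The notions in the abstract (anisotropy, outlier dimensions, and isotropy enhancement) can be made concrete with a short sketch. The snippet below is illustrative only and is not the authors' implementation (see the linked repository for that): it approximates anisotropy as the average cosine similarity between random pairs of representations, flags candidate outlier dimensions by their unusually large average magnitude, and increases isotropy by removing the dominant principal components of the mean-centered space. The model, thresholds, and component counts are assumed values.

import numpy as np

def average_cosine_similarity(embeddings: np.ndarray, n_pairs: int = 1000, seed: int = 0) -> float:
    """Approximate anisotropy as the mean cosine similarity of random embedding pairs.
    Values near 0 suggest an isotropic space; values near 1, a highly anisotropic one."""
    rng = np.random.default_rng(seed)
    idx_a = rng.integers(0, len(embeddings), n_pairs)
    idx_b = rng.integers(0, len(embeddings), n_pairs)
    a, b = embeddings[idx_a], embeddings[idx_b]
    sims = np.sum(a * b, axis=1) / (np.linalg.norm(a, axis=1) * np.linalg.norm(b, axis=1))
    return float(np.mean(sims))

def outlier_dimensions(embeddings: np.ndarray, k: float = 3.0) -> np.ndarray:
    """Flag dimensions whose mean absolute value lies more than k standard deviations
    above the average per-dimension magnitude (a simple heuristic, not the paper's exact criterion)."""
    dim_means = np.abs(embeddings).mean(axis=0)
    return np.where(dim_means > dim_means.mean() + k * dim_means.std())[0]

def increase_isotropy(embeddings: np.ndarray, n_components: int = 12) -> np.ndarray:
    """Center the representations and null out their top principal components,
    a common global isotropy-enhancement step."""
    centered = embeddings - embeddings.mean(axis=0, keepdims=True)
    # Top right-singular vectors are the dominant directions of the space.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    dominant = vt[:n_components]
    return centered - centered @ dominant.T @ dominant

# Example usage with random data standing in for contextual representations:
# reps = np.random.randn(5000, 768)
# print(average_cosine_similarity(reps), outlier_dimensions(reps))
# print(average_cosine_similarity(increase_isotropy(reps)))

Comparing average_cosine_similarity before and after increase_isotropy on a sample of contextual representations gives a rough picture of how anisotropic the original space is and how much the post-processing flattens it.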
Anthology ID:
2022.findings-acl.103
Volume:
Findings of the Association for Computational Linguistics: ACL 2022
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
1309–1316
URL:
https://aclanthology.org/2022.findings-acl.103
DOI:
10.18653/v1/2022.findings-acl.103
Cite (ACL):
Sara Rajaee and Mohammad Taher Pilehvar. 2022. An Isotropy Analysis in the Multilingual BERT Embedding Space. In Findings of the Association for Computational Linguistics: ACL 2022, pages 1309–1316, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
An Isotropy Analysis in the Multilingual BERT Embedding Space (Rajaee & Pilehvar, Findings 2022)
PDF:
https://aclanthology.org/2022.findings-acl.103.pdf
Code:
sara-rajaee/multilingual-isotropy