Language Contamination Helps Explain the Cross-lingual Capabilities of English Pretrained Models

Terra Blevins, Luke Zettlemoyer


Abstract
English pretrained language models, which form the backbone of many modern NLP systems, require huge amounts of unlabeled training data. These models are generally presented as being trained only on English text but have been found to transfer surprisingly well to other languages. We investigate this phenomenon and find that common English pretraining corpora actually contain significant amounts of non-English text: even when less than 1% of the data is not English (well within the error rate of strong language classifiers), this still amounts to hundreds of millions of foreign-language tokens in large-scale datasets. We then demonstrate that even these small percentages of non-English data facilitate cross-lingual transfer for models trained on them, with target-language performance strongly correlated with the amount of in-language data seen during pretraining. In light of these findings, we argue that no model is truly monolingual when pretrained at scale, which should be considered when evaluating cross-lingual transfer.
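To make the scale of such contamination concrete, the sketch below shows one way to estimate the non-English fraction of a corpus sample with an off-the-shelf language identifier (fastText's pretrained lid.176 model from fasttext.cc) and to translate that fraction into an absolute token count. This is an illustrative sketch under assumed numbers and an assumed classifier, not the pipeline or figures reported in the paper.

# Minimal sketch: estimate how much of a corpus sample is non-English using an
# off-the-shelf language identifier. Illustrative only; not the paper's pipeline.
import fasttext  # pip install fasttext

# lid.176.bin is fastText's pretrained language-ID model (download from fasttext.cc).
lid = fasttext.load_model("lid.176.bin")

def non_english_fraction(lines, threshold=0.5):
    """Return the fraction of non-empty lines whose top predicted language is not English."""
    non_english = total = 0
    for line in lines:
        text = line.strip().replace("\n", " ")  # fastText predict() rejects newlines
        if not text:
            continue
        (label,), probs = lid.predict(text, k=1)
        total += 1
        if label != "__label__en" and probs[0] >= threshold:
            non_english += 1
    return non_english / max(total, 1)

# Toy usage example (likely 0.5 for this two-line sample):
sample = [
    "The quick brown fox jumps over the lazy dog.",
    "El zorro marrón salta sobre el perro perezoso.",
]
print(non_english_fraction(sample))

# Hypothetical arithmetic (not figures from the paper): even a 0.5% non-English
# rate in a 300B-token pretraining corpus is roughly 1.5B non-English tokens.
print(0.005 * 300e9 / 1e9, "billion non-English tokens")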
Anthology ID: 2022.emnlp-main.233
Volume: Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing
Month: December
Year: 2022
Address: Abu Dhabi, United Arab Emirates
Editors: Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue: EMNLP
Publisher: Association for Computational Linguistics
Pages: 3563–3574
URL: https://aclanthology.org/2022.emnlp-main.233
DOI: 10.18653/v1/2022.emnlp-main.233
Cite (ACL): Terra Blevins and Luke Zettlemoyer. 2022. Language Contamination Helps Explain the Cross-lingual Capabilities of English Pretrained Models. In Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, pages 3563–3574, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal): Language Contamination Helps Explain the Cross-lingual Capabilities of English Pretrained Models (Blevins & Zettlemoyer, EMNLP 2022)
PDF: https://aclanthology.org/2022.emnlp-main.233.pdf