Multilingual BERT has an Accent: Evaluating English Influences on Fluency in Multilingual Models

Isabel Papadimitriou, Kezia Lopez, Dan Jurafsky


Abstract
While multilingual language models can improve NLP performance on low-resource languages by leveraging higher-resource languages, they also reduce average performance on all languages (the ‘curse of multilinguality’). Here we show another problem with multilingual models: grammatical structures in higher-resource languages bleed into lower-resource languages, a phenomenon we call grammatical structure bias. We show this bias via a novel method for comparing the fluency of multilingual models to the fluency of monolingual Spanish and Greek models: testing their preference for two carefully-chosen variable grammatical structures (optional pronoun-drop in Spanish and optional Subject-Verb ordering in Greek). We find that multilingual BERT is biased toward the English-like setting (explicit pronouns and Subject-Verb-Object ordering) and against the default Spanish and Greek settings, as compared to our monolingual control language model. With our case studies, we hope to bring to light the fine-grained ways in which multilingual models can be biased, and encourage more linguistically-aware fluency evaluation.
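The core of the evaluation described above is scoring minimal pairs that differ only in one variable structure (e.g. a Spanish sentence with and without an explicit subject pronoun) and checking which variant a model prefers. A minimal sketch of that comparison logic is below; note that the word probabilities and the `TOY_LM` unigram stand-in are invented for illustration only. The paper's actual scores come from masked language models (mBERT vs. a monolingual control), not from anything this simple.

```python
import math

# Toy stand-in for a language model: invented word probabilities,
# used only to illustrate the minimal-pair comparison. The real setup
# would score sentences with a masked LM (e.g. mBERT) instead.
TOY_LM = {
    "yo": 0.05, "canto": 0.10, "una": 0.15, "cancion": 0.08,
}

def sentence_logprob(words, lm, oov=1e-6):
    """Sum of per-word log-probabilities under the toy unigram model."""
    return sum(math.log(lm.get(w, oov)) for w in words)

def preference(variant_a, variant_b, lm):
    """Log-odds that the model prefers variant A over variant B.
    Positive => A scores as more fluent; negative => B does."""
    return sentence_logprob(variant_a, lm) - sentence_logprob(variant_b, lm)

# Minimal pair for Spanish pro-drop: explicit pronoun vs. dropped pronoun.
explicit = ["yo", "canto", "una", "cancion"]  # "yo canto una cancion"
dropped = ["canto", "una", "cancion"]         # "canto una cancion"

delta = preference(explicit, dropped, TOY_LM)
```

Aggregating this delta over many such pairs, and comparing its distribution between a multilingual model and a monolingual control, is the shape of the paper's bias test: a multilingual model shifted toward the English-like variant (here, the explicit pronoun) relative to the monolingual baseline exhibits grammatical structure bias.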
Anthology ID:
2023.sigtyp-1.16
Volume:
Proceedings of the 5th Workshop on Research in Computational Linguistic Typology and Multilingual NLP
Month:
May
Year:
2023
Address:
Dubrovnik, Croatia
Editors:
Lisa Beinborn, Koustava Goswami, Saliha Muradoğlu, Alexey Sorokin, Ritesh Kumar, Andreas Shcherbakov, Edoardo M. Ponti, Ryan Cotterell, Ekaterina Vylomova
Venue:
SIGTYP
Publisher:
Association for Computational Linguistics
Pages:
143–146
URL:
https://aclanthology.org/2023.sigtyp-1.16
DOI:
10.18653/v1/2023.sigtyp-1.16
Cite (ACL):
Isabel Papadimitriou, Kezia Lopez, and Dan Jurafsky. 2023. Multilingual BERT has an Accent: Evaluating English Influences on Fluency in Multilingual Models. In Proceedings of the 5th Workshop on Research in Computational Linguistic Typology and Multilingual NLP, pages 143–146, Dubrovnik, Croatia. Association for Computational Linguistics.
Cite (Informal):
Multilingual BERT has an Accent: Evaluating English Influences on Fluency in Multilingual Models (Papadimitriou et al., SIGTYP 2023)
PDF:
https://aclanthology.org/2023.sigtyp-1.16.pdf