An Efficient Approach for Studying Cross-Lingual Transfer in Multilingual Language Models

Fahim Faisal, Antonios Anastasopoulos


Abstract
The capacity and effectiveness of pre-trained multilingual models (MLMs) for zero-shot cross-lingual transfer are well established. However, phenomena of positive or negative transfer, and the effect of language choice, are not yet fully understood, especially in the complex setting of massively multilingual LMs. We propose an efficient method to study the influence of a transfer language on zero-shot performance in a target language. Unlike previous work, our approach disentangles downstream tasks from language using dedicated adapter units. Our findings suggest that some languages have little effect on others, while some languages, especially ones unseen during pre-training, can be extremely beneficial or detrimental to different target languages. We find that no transfer language is beneficial for all target languages. Curiously, we observe that languages previously unseen by MLMs consistently benefit from transfer from almost any language. We additionally use our modular approach to efficiently quantify negative interference and to categorize languages accordingly. Furthermore, we provide a list of promising transfer-target language configurations that consistently lead to target language performance improvements.
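The modular setup described in the abstract can be approximated with the HuggingFace `adapters` library (the successor to adapter-transformers). The sketch below is a minimal MAD-X-style illustration, not the authors' exact pipeline: separate language and task adapters are stacked on a frozen multilingual backbone, and zero-shot transfer amounts to swapping the language adapter at inference time. The adapter names (`lang_en`, `lang_sw`, `task_ner`), the base checkpoint, and the NER task are hypothetical choices for illustration.

```python
# Minimal sketch of task/language adapter stacking for zero-shot
# cross-lingual transfer, using the HuggingFace `adapters` library.
# Adapter names and the task are illustrative, not the paper's setup.
import adapters
from adapters.composition import Stack
from transformers import AutoModelForTokenClassification

model = AutoModelForTokenClassification.from_pretrained(
    "bert-base-multilingual-cased", num_labels=9
)
adapters.init(model)  # enable adapter support on a vanilla HF model

# Dedicated adapter units: one per language, one per task.
model.add_adapter("lang_en")   # transfer-language adapter (hypothetical)
model.add_adapter("lang_sw")   # target-language adapter (hypothetical)
model.add_adapter("task_ner")  # task adapter (hypothetical)

# Train only the task adapter, stacked on the transfer-language
# adapter; train_adapter freezes all other weights in the model.
model.train_adapter("task_ner")
model.active_adapters = Stack("lang_en", "task_ner")
# ... fine-tune here on labeled data in the transfer language ...

# Zero-shot transfer: swap in the target-language adapter while
# keeping the trained task adapter fixed, then evaluate.
model.active_adapters = Stack("lang_sw", "task_ner")
```

Because only the small adapter modules are trained while the MLM backbone stays frozen, sweeping over many transfer-target language pairs reduces to swapping adapters rather than re-fine-tuning the full model, which is what makes this style of transfer study efficient.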
Anthology ID:
2024.mrl-1.4
Volume:
Proceedings of the Fourth Workshop on Multilingual Representation Learning (MRL 2024)
Month:
November
Year:
2024
Address:
Miami, Florida, USA
Editors:
Jonne Sälevä, Abraham Owodunni
Venue:
MRL
Publisher:
Association for Computational Linguistics
Pages:
45–92
URL:
https://aclanthology.org/2024.mrl-1.4
Cite (ACL):
Fahim Faisal and Antonios Anastasopoulos. 2024. An Efficient Approach for Studying Cross-Lingual Transfer in Multilingual Language Models. In Proceedings of the Fourth Workshop on Multilingual Representation Learning (MRL 2024), pages 45–92, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal):
An Efficient Approach for Studying Cross-Lingual Transfer in Multilingual Language Models (Faisal & Anastasopoulos, MRL 2024)
PDF:
https://aclanthology.org/2024.mrl-1.4.pdf