Analysis of Multi-Source Language Training in Cross-Lingual Transfer

Seonghoon Lim, Taejun Yun, Jinhyeon Kim, Jihun Choi, Taeuk Kim


Abstract
The successful adaptation of multilingual language models (LMs) to a specific language-task pair critically depends on the availability of data tailored for that condition. While cross-lingual transfer (XLT) methods have helped address this data scarcity problem, there is still ongoing debate about the mechanisms behind their effectiveness. In this work, we focus on one promising assumption about the inner workings of XLT: that it encourages multilingual LMs to place greater emphasis on language-agnostic or task-specific features. We test this hypothesis by examining how the patterns of XLT change with a varying number of source languages involved in the process. Our experimental findings show that the use of multiple source languages in XLT, a technique we term Multi-Source Language Training (MSLT), leads to increased mingling of embedding spaces for different languages, supporting the claim that XLT benefits from making use of language-independent information. On the other hand, we discover that using an arbitrary combination of source languages does not always guarantee better performance. We suggest simple heuristics for identifying effective language combinations for MSLT and empirically demonstrate their effectiveness.
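As the abstract describes, MSLT amounts to fine-tuning a single multilingual LM on pooled labeled data from several source languages, then evaluating zero-shot on a target language with no task data of its own. The following is a minimal sketch of that setup using Hugging Face Transformers and the XNLI benchmark; the model choice (xlm-roberta-base), the particular source-language combination, and all hyperparameters are illustrative assumptions, not the paper's exact configuration.

```python
# Hypothetical sketch of Multi-Source Language Training (MSLT):
# fine-tune one multilingual LM on pooled data from several source
# languages, then evaluate zero-shot on a held-out target language.
# Model, language set, and hyperparameters are illustrative only.
from datasets import load_dataset, concatenate_datasets
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

SOURCE_LANGS = ["en", "de", "es"]  # example multi-source combination
TARGET_LANG = "sw"                 # unseen target language (Swahili)

tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
model = AutoModelForSequenceClassification.from_pretrained(
    "xlm-roberta-base", num_labels=3  # XNLI: entailment / neutral / contradiction
)

def tokenize(batch):
    return tokenizer(
        batch["premise"], batch["hypothesis"], truncation=True, max_length=128
    )

# Pool the training splits of all source languages into one dataset.
train_data = concatenate_datasets(
    [load_dataset("xnli", lang, split="train") for lang in SOURCE_LANGS]
).shuffle(seed=42).map(tokenize, batched=True)

# Zero-shot evaluation set in the target language.
eval_data = load_dataset("xnli", TARGET_LANG, split="test").map(
    tokenize, batched=True
)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="mslt-xnli",
        per_device_train_batch_size=32,
        num_train_epochs=2,
        learning_rate=2e-5,
    ),
    train_dataset=train_data,
    eval_dataset=eval_data,
    tokenizer=tokenizer,
)
trainer.train()
# Zero-shot loss on the target language (supply compute_metrics for accuracy).
print(trainer.evaluate())
```

Swapping the contents of SOURCE_LANGS is all it takes to compare different source-language combinations, which is the axis the paper's analysis varies.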
Anthology ID:
2024.acl-long.42
Volume:
Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
August
Year:
2024
Address:
Bangkok, Thailand
Editors:
Lun-Wei Ku, Andre Martins, Vivek Srikumar
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
712–725
URL:
https://aclanthology.org/2024.acl-long.42
Cite (ACL):
Seonghoon Lim, Taejun Yun, Jinhyeon Kim, Jihun Choi, and Taeuk Kim. 2024. Analysis of Multi-Source Language Training in Cross-Lingual Transfer. In Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 712–725, Bangkok, Thailand. Association for Computational Linguistics.
Cite (Informal):
Analysis of Multi-Source Language Training in Cross-Lingual Transfer (Lim et al., ACL 2024)
PDF:
https://aclanthology.org/2024.acl-long.42.pdf