The Impact of Language Adapters in Cross-Lingual Transfer for NLU

Jenny Kunz, Oskar Holmström


Abstract
Modular deep learning has been proposed for the efficient adaptation of pre-trained models to new tasks, domains and languages. In particular, combining language adapters with task adapters has shown potential where no supervised data exists for a language. In this paper, we explore the role of language adapters in zero-shot cross-lingual transfer for natural language understanding (NLU) benchmarks. We study the effect of including a target-language adapter in detailed ablation studies with two multilingual models and three multilingual datasets. Our results show that the effect of target-language adapters is highly inconsistent across tasks, languages and models. Retaining the source-language adapter instead often leads to an equivalent, and sometimes to a better, performance. Removing the language adapter after training has only a weak negative effect, indicating that the language adapters do not have a strong impact on the predictions.
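
Below is a minimal sketch of the adapter-stacking setup the abstract describes, assuming a MAD-X-style recipe with the AdapterHub "adapters" library. The model name, adapter identifiers, task name and label count are illustrative assumptions, not details taken from the paper, and the exact loading calls may differ between library versions.

from adapters import AutoAdapterModel
from adapters.composition import Stack

# Multilingual backbone (illustrative choice; the paper studies two multilingual models).
model = AutoAdapterModel.from_pretrained("xlm-roberta-base")

# Pre-trained language adapters for the source and target language
# (AdapterHub identifiers are illustrative).
model.load_adapter("en/wiki@ukp", load_as="en", with_head=False)
model.load_adapter("sw/wiki@ukp", load_as="sw", with_head=False)

# A new task adapter plus a classification head for the NLU task.
model.add_adapter("nli")
model.add_classification_head("nli", num_labels=3)

# Freeze everything except the task adapter, then stack it on top of the
# source-language adapter while fine-tuning on source-language (e.g. English) data.
model.train_adapter("nli")
model.active_adapters = Stack("en", "nli")
# ... fine-tune on the source-language training set ...

# Zero-shot inference: swap in the target-language adapter under the same task adapter.
model.active_adapters = Stack("sw", "nli")

# The ablations discussed in the abstract correspond to keeping the source-language
# adapter or dropping the language adapter entirely at inference time:
# model.active_adapters = Stack("en", "nli")
# model.active_adapters = "nli"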
Anthology ID:
2024.moomin-1.4
Volume:
Proceedings of the 1st Workshop on Modular and Open Multilingual NLP (MOOMIN 2024)
Month:
March
Year:
2024
Address:
St. Julian's, Malta
Editors:
Raúl Vázquez, Timothee Mickus, Jörg Tiedemann, Ivan Vulić, Ahmet Üstün
Venues:
MOOMIN | WS
Publisher:
Association for Computational Linguistics
Pages:
24–43
URL:
https://aclanthology.org/2024.moomin-1.4
Cite (ACL):
Jenny Kunz and Oskar Holmström. 2024. The Impact of Language Adapters in Cross-Lingual Transfer for NLU. In Proceedings of the 1st Workshop on Modular and Open Multilingual NLP (MOOMIN 2024), pages 24–43, St. Julian's, Malta. Association for Computational Linguistics.
Cite (Informal):
The Impact of Language Adapters in Cross-Lingual Transfer for NLU (Kunz & Holmström, MOOMIN-WS 2024)
PDF:
https://aclanthology.org/2024.moomin-1.4.pdf