Key ingredients for effective zero-shot cross-lingual knowledge transfer in generative tasks

Nadezhda Chirkova, Vassilina Nikoulina


Abstract
Zero-shot cross-lingual transfer, in which a multilingual pretrained language model is finetuned on input-output pairs in one language and then used to make task predictions for inputs in other languages, has been widely studied for natural language understanding but remains understudied for generation. Previous works note the frequent problem of generation in the wrong language and propose approaches to address it, usually using mT5 as a backbone model. In this work we compare various approaches proposed in the literature in a unified setting, also including alternative backbone models, namely mBART and NLLB-200. We first underline the importance of tuning the learning rate used for finetuning, which helps to substantially alleviate the problem of generation in the wrong language. Then, we show that with careful learning rate tuning, simple full finetuning of the model acts as a very strong baseline and alternative approaches bring only marginal improvements. Finally, we find that mBART performs similarly to mT5 of the same size, and that NLLB-200 can be competitive in some cases. Our final zero-shot models reach the performance of the approach based on data translation, which is usually considered an upper baseline for zero-shot cross-lingual transfer in generation.
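The transfer setup summarized in the abstract can be sketched in a few lines of code. The sketch below is illustrative only and not taken from the paper: it assumes the HuggingFace Transformers library, google/mt5-base as the backbone, a toy English (input, target) pair, and a placeholder learning rate of 1e-5; the authors' actual data, hyperparameters, and backbones may differ.

# Minimal sketch (not from the paper): full finetuning on one language,
# zero-shot prediction in another. Model name, data, and learning rate
# are illustrative placeholders, not the authors' settings.
import torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "google/mt5-base"  # assumed backbone; the paper also studies mBART and NLLB-200
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# Hypothetical English-only finetuning data: (input, target) pairs.
train_pairs = [("The cat sat on the mat and fell asleep in the sun.", "A cat slept on a mat.")]

# A small, carefully tuned learning rate is the key ingredient highlighted in the abstract.
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-5)

model.train()
for src, tgt in train_pairs:
    batch = tokenizer(src, return_tensors="pt", truncation=True)
    labels = tokenizer(tgt, return_tensors="pt", truncation=True).input_ids
    loss = model(**batch, labels=labels).loss
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()

# Zero-shot transfer: apply the English-finetuned model to an input in another language.
model.eval()
test_input = tokenizer("Le chat s'est assis sur le tapis et s'est endormi au soleil.", return_tensors="pt")
output_ids = model.generate(**test_input, max_new_tokens=40)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))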
Anthology ID:
2024.naacl-long.401
Volume:
Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 1: Long Papers)
Month:
June
Year:
2024
Address:
Mexico City, Mexico
Editors:
Kevin Duh, Helena Gomez, Steven Bethard
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
7215–7231
URL:
https://aclanthology.org/2024.naacl-long.401
Cite (ACL):
Nadezhda Chirkova and Vassilina Nikoulina. 2024. Key ingredients for effective zero-shot cross-lingual knowledge transfer in generative tasks. In Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 1: Long Papers), pages 7215–7231, Mexico City, Mexico. Association for Computational Linguistics.
Cite (Informal):
Key ingredients for effective zero-shot cross-lingual knowledge transfer in generative tasks (Chirkova & Nikoulina, NAACL 2024)
PDF:
https://aclanthology.org/2024.naacl-long.401.pdf
Copyright:
 2024.naacl-long.401.copyright.pdf