Improving In-context Learning of Multilingual Generative Language Models with Cross-lingual Alignment

Chong Li, Shaonan Wang, Jiajun Zhang, Chengqing Zong
Abstract
Multilingual generative models obtain remarkable cross-lingual in-context learning capabilities through pre-training on large-scale corpora. However, they still exhibit a performance bias toward high-resource languages and learn isolated distributions of multilingual sentence representations, which may hinder knowledge transfer across languages. To bridge this gap, we propose a simple yet effective cross-lingual alignment framework that exploits translation sentence pairs. It aligns the internal sentence representations across different languages via multilingual contrastive learning and aligns outputs by following cross-lingual instructions in the target language. Experimental results show that, even with less than 0.1‰ of the pre-training tokens, our alignment framework significantly boosts the cross-lingual abilities of generative language models and mitigates the performance gap. Further analyses reveal that it produces a better-aligned internal multilingual representation distribution.
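The abstract only names the first alignment component; the exact objective is given in the paper. As a rough illustration, a symmetric InfoNCE-style contrastive loss over a batch of translation pairs might look like the following PyTorch sketch (the function name, temperature value, and InfoNCE formulation are illustrative assumptions, not taken from the paper):

```python
import torch
import torch.nn.functional as F

def alignment_loss(src_reps: torch.Tensor, tgt_reps: torch.Tensor,
                   temperature: float = 0.05) -> torch.Tensor:
    """Symmetric InfoNCE over a batch of translation pairs (illustrative).

    src_reps, tgt_reps: (batch, dim) sentence representations taken from
    the model's hidden states; row i of each tensor is a translation pair.
    """
    src = F.normalize(src_reps, dim=-1)
    tgt = F.normalize(tgt_reps, dim=-1)
    # (batch, batch) cosine-similarity matrix; the diagonal holds the
    # similarities of the true translation pairs.
    logits = src @ tgt.T / temperature
    labels = torch.arange(src.size(0), device=src.device)
    # Pull each sentence toward its translation and push it away from the
    # other in-batch sentences, in both source-to-target and
    # target-to-source directions.
    return 0.5 * (F.cross_entropy(logits, labels)
                  + F.cross_entropy(logits.T, labels))
```

In this kind of setup, other sentences in the batch serve as negatives, so larger batches generally give a stronger contrastive signal; the second, output-level component (following cross-lingual instructions in the target language) is a standard supervised fine-tuning objective and is omitted here.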
Anthology ID: 2024.naacl-long.445
Volume: Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 1: Long Papers)
Month: June
Year: 2024
Address: Mexico City, Mexico
Editors: Kevin Duh, Helena Gomez, Steven Bethard
Venue: NAACL
Publisher: Association for Computational Linguistics
Pages: 8058–8076
URL: https://aclanthology.org/2024.naacl-long.445
DOI: 10.18653/v1/2024.naacl-long.445
Cite (ACL):
Chong Li, Shaonan Wang, Jiajun Zhang, and Chengqing Zong. 2024. Improving In-context Learning of Multilingual Generative Language Models with Cross-lingual Alignment. In Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 1: Long Papers), pages 8058–8076, Mexico City, Mexico. Association for Computational Linguistics.
Cite (Informal):
Improving In-context Learning of Multilingual Generative Language Models with Cross-lingual Alignment (Li et al., NAACL 2024)
PDF: https://aclanthology.org/2024.naacl-long.445.pdf