PreAlign: Boosting Cross-Lingual Transfer by Early Establishment of Multilingual Alignment

Jiahuan Li, Shujian Huang, Aarron Ching, Xinyu Dai, Jiajun Chen


Abstract
Large language models demonstrate reasonable multilingual abilities, despite predominantly English-centric pretraining. However, the spontaneous multilingual alignment in these models is shown to be weak, leading to unsatisfactory cross-lingual transfer and knowledge sharing. Previous works attempt to address this issue by explicitly injecting multilingual alignment information during or after pretraining; as a result, alignment in the early stage of pretraining remains too weak to support the sharing of information or knowledge across languages. In this paper, we propose PreAlign, a framework that establishes multilingual alignment prior to language model pretraining. PreAlign injects multilingual alignment by initializing the model to generate similar representations of aligned words, and preserves this alignment through a code-switching strategy during pretraining. Extensive experiments in a synthetic English to English-Clone setting demonstrate that PreAlign significantly outperforms standard multilingual joint training in language modeling, zero-shot cross-lingual transfer, and cross-lingual knowledge application. Experiments in real-world scenarios further validate PreAlign’s effectiveness across various model sizes.
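The two mechanisms named in the abstract can be illustrated with a minimal sketch: a pre-pretraining objective that pulls the embeddings of dictionary-aligned word pairs together, and a code-switching augmentation that substitutes words with their translations during pretraining. The lexicon entries, replacement probability, and MSE objective below are illustrative assumptions for exposition, not the authors' implementation.

```python
import random

import torch
import torch.nn as nn
import torch.nn.functional as F

# Toy bilingual dictionary (word -> translation); PreAlign assumes access
# to aligned word pairs, but these specific entries are made up here.
LEXICON = {"dog": "Hund", "house": "Haus", "water": "Wasser"}

def code_switch(tokens, lexicon, p=0.15, rng=random):
    """Randomly swap dictionary words for their translations, producing
    code-switched pretraining text in which aligned words co-occur."""
    return [
        lexicon[t] if t in lexicon and rng.random() < p else t
        for t in tokens
    ]

def alignment_init_loss(embedding, src_ids, tgt_ids):
    """One candidate objective for the pre-pretraining alignment step:
    pull embeddings of aligned token pairs together. MSE is a stand-in;
    the paper's exact objective may differ."""
    return F.mse_loss(embedding(src_ids), embedding(tgt_ids))

if __name__ == "__main__":
    random.seed(0)
    sentence = "the dog drank water near the house".split()
    print(" ".join(code_switch(sentence, LEXICON, p=0.5)))

    emb = nn.Embedding(num_embeddings=100, embedding_dim=16)
    src = torch.tensor([3, 7, 21])    # ids of source-language words
    tgt = torch.tensor([54, 60, 88])  # ids of their aligned translations
    loss = alignment_init_loss(emb, src, tgt)
    loss.backward()  # gradient step would drive aligned embeddings together
    print(float(loss))
```

One way to read the design: the initialization step establishes alignment before any pretraining signal arrives, while code-switching keeps aligned pairs co-occurring in context, which is what preserves that alignment as pretraining updates the representations.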
Anthology ID:
2024.emnlp-main.572
Original:
2024.emnlp-main.572v1
Version 2:
2024.emnlp-main.572v2
Volume:
Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2024
Address:
Miami, Florida, USA
Editors:
Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
10246–10257
URL:
https://aclanthology.org/2024.emnlp-main.572/
DOI:
10.18653/v1/2024.emnlp-main.572
Cite (ACL):
Jiahuan Li, Shujian Huang, Aarron Ching, Xinyu Dai, and Jiajun Chen. 2024. PreAlign: Boosting Cross-Lingual Transfer by Early Establishment of Multilingual Alignment. In Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing, pages 10246–10257, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal):
PreAlign: Boosting Cross-Lingual Transfer by Early Establishment of Multilingual Alignment (Li et al., EMNLP 2024)
PDF:
https://aclanthology.org/2024.emnlp-main.572.pdf