InteMATs: Integrating Granularity-Specific Multilingual Adapters for Cross-Lingual Transfer

Meizhen Liu, Xu Guo, He Jiakai, Jianye Chen, Fengyu Zhou, Siu Hui


Abstract
Multilingual language models (MLLMs) have achieved remarkable success in various cross-lingual transfer tasks. However, they perform poorly on zero-shot transfer to low-resource languages, particularly when dealing with longer contexts. Existing research mainly relies on full-model fine-tuning on large parallel datasets to enhance the cross-lingual alignment of MLLMs, which is computationally expensive. In this paper, we propose InteMATs, a novel approach that integrates multilingual adapters trained on texts of different levels of granularity. To achieve this, we curate a multilingual parallel dataset covering 42 languages and use it to pre-train sentence-level and document-level adapters under a contrastive learning framework. Extensive experiments demonstrate the effectiveness of InteMATs in improving the cross-lingual transfer performance of MLLMs, especially on low-resource languages. Finally, our comprehensive analyses and ablation studies provide a deep understanding of the high-quality representations derived by InteMATs.
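The abstract gives no implementation details, but the contrastive pre-training objective it describes can be sketched as follows. This is a minimal illustration, not the paper's method: the bottleneck `Adapter` module (Houlsby-style), the InfoNCE-style in-batch loss, and the `bottleneck` and `temperature` values are all standard choices assumed here; the paper's actual architecture, objective, and hyperparameters may differ.

```python
import torch
import torch.nn.functional as F


class Adapter(torch.nn.Module):
    """Bottleneck adapter: down-project, nonlinearity, up-project, residual.

    In adapter-based transfer, modules like this are inserted into a frozen
    MLLM and are the only parameters updated during pre-training.
    """

    def __init__(self, hidden_size: int, bottleneck: int = 64):
        super().__init__()
        self.down = torch.nn.Linear(hidden_size, bottleneck)
        self.up = torch.nn.Linear(bottleneck, hidden_size)

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        return h + self.up(F.relu(self.down(h)))  # residual connection


def contrastive_loss(src_emb: torch.Tensor,
                     tgt_emb: torch.Tensor,
                     temperature: float = 0.05) -> torch.Tensor:
    """InfoNCE-style loss over a batch of parallel pairs.

    The i-th source and i-th target embeddings (e.g., a sentence or
    document and its translation) are positives; all other in-batch
    pairs serve as negatives.
    """
    src = F.normalize(src_emb, dim=-1)
    tgt = F.normalize(tgt_emb, dim=-1)
    logits = src @ tgt.T / temperature                 # (B, B) similarities
    labels = torch.arange(src.size(0), device=src.device)
    return F.cross_entropy(logits, labels)
```

Under this sketch, a sentence-level adapter would be trained with this loss on parallel sentence pairs and a document-level adapter on parallel document pairs from the 42-language corpus, with the two granularity-specific adapters integrated at fine-tuning time.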
Anthology ID:
2023.findings-emnlp.335
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2023
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
5035–5049
URL:
https://aclanthology.org/2023.findings-emnlp.335
DOI:
10.18653/v1/2023.findings-emnlp.335
Cite (ACL):
Meizhen Liu, Xu Guo, He Jiakai, Jianye Chen, Fengyu Zhou, and Siu Hui. 2023. InteMATs: Integrating Granularity-Specific Multilingual Adapters for Cross-Lingual Transfer. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 5035–5049, Singapore. Association for Computational Linguistics.
Cite (Informal):
InteMATs: Integrating Granularity-Specific Multilingual Adapters for Cross-Lingual Transfer (Liu et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-emnlp.335.pdf