Exploring Intrinsic Language-specific Subspaces in Fine-tuning Multilingual Neural Machine Translation

Zhe Cao, Zhi Qu, Hidetaka Kamigaito, Taro Watanabe


Abstract
Multilingual neural machine translation models support fine-tuning hundreds of languages simultaneously. However, fine-tuning all parameters is inefficient and can lead to negative interactions among languages. In this work, we demonstrate that fine-tuning for a language occurs in its intrinsic language-specific subspace, which comprises only a tiny fraction of the model's parameters. We therefore propose language-specific LoRA to isolate these intrinsic language-specific subspaces. Furthermore, we propose architecture learning techniques and introduce a gradual pruning schedule during fine-tuning to exhaustively explore the optimal setting and the minimal intrinsic subspace for each language, resulting in a lightweight yet effective fine-tuning procedure. Experimental results on a 12-language subset and a 30-language subset of FLORES-101 show that our methods not only outperform full-parameter fine-tuning by up to 2.25 spBLEU but also reduce trainable parameters to 0.4% for high- and medium-resource languages and 1.6% for low-resource ones.
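The abstract does not include implementation details, so the following is a minimal PyTorch sketch of what a language-specific LoRA layer with a gradual magnitude-pruning schedule could look like. All names here (LanguageSpecificLoRALinear, rank, alpha, magnitude_prune) are illustrative assumptions, and the cubic sparsity schedule is borrowed from Zhu & Gupta (2017) as a stand-in; none of this is the authors' code.

```python
# Sketch: a frozen base linear layer plus one low-rank (LoRA) adapter pair per
# language, selected by language ID, with gradual magnitude pruning of the
# adapter weights. Hypothetical illustration, not the paper's implementation.
import torch
import torch.nn as nn


class LanguageSpecificLoRALinear(nn.Module):
    """Frozen shared weight W plus a per-language update W + B_l @ A_l."""

    def __init__(self, base: nn.Linear, languages: list, rank: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():  # fine-tune only the adapters
            p.requires_grad_(False)
        self.scaling = alpha / rank
        self.lang2idx = {lang: i for i, lang in enumerate(languages)}
        d_in, d_out = base.in_features, base.out_features
        # One low-rank pair per language; B starts at zero so the initial
        # update is the identity (standard LoRA initialization).
        self.lora_A = nn.ParameterList(
            [nn.Parameter(torch.randn(rank, d_in) * 0.01) for _ in languages]
        )
        self.lora_B = nn.ParameterList(
            [nn.Parameter(torch.zeros(d_out, rank)) for _ in languages]
        )

    def forward(self, x: torch.Tensor, lang: str) -> torch.Tensor:
        i = self.lang2idx[lang]
        delta = (x @ self.lora_A[i].T) @ self.lora_B[i].T
        return self.base(x) + self.scaling * delta


def sparsity_at(step: int, total_steps: int, final_sparsity: float) -> float:
    """Cubic gradual-pruning schedule: sparsity ramps from 0 to final_sparsity."""
    frac = min(step / total_steps, 1.0)
    return final_sparsity * (1.0 - (1.0 - frac) ** 3)


@torch.no_grad()
def magnitude_prune(layer: LanguageSpecificLoRALinear, lang: str, sparsity: float) -> None:
    """Zero out the smallest-magnitude adapter weights for one language."""
    i = layer.lang2idx[lang]
    for p in (layer.lora_A[i], layer.lora_B[i]):
        k = int(sparsity * p.numel())
        if k > 0:
            threshold = p.abs().flatten().kthvalue(k).values
            p.mul_((p.abs() > threshold).to(p.dtype))


# Usage with made-up dimensions and language codes:
layer = LanguageSpecificLoRALinear(nn.Linear(512, 512), ["de", "fr", "zh"])
y = layer(torch.randn(4, 10, 512), lang="de")
magnitude_prune(layer, "de", sparsity_at(step=500, total_steps=1000, final_sparsity=0.9))
```

Routing each sentence through only its own language's adapter is one way to realize the paper's stated goal of isolating per-language subspaces, and pruning the adapters during training corresponds to shrinking each subspace toward its minimal size; the exact architecture learning techniques are described in the paper itself.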
Anthology ID: 2024.emnlp-main.1177
Volume: Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing
Month: November
Year: 2024
Address: Miami, Florida, USA
Editors: Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
Venue: EMNLP
Publisher: Association for Computational Linguistics
Pages: 21142–21157
URL: https://aclanthology.org/2024.emnlp-main.1177
Cite (ACL): Zhe Cao, Zhi Qu, Hidetaka Kamigaito, and Taro Watanabe. 2024. Exploring Intrinsic Language-specific Subspaces in Fine-tuning Multilingual Neural Machine Translation. In Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing, pages 21142–21157, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal): Exploring Intrinsic Language-specific Subspaces in Fine-tuning Multilingual Neural Machine Translation (Cao et al., EMNLP 2024)
PDF: https://aclanthology.org/2024.emnlp-main.1177.pdf