A Novel Two-step Fine-tuning Framework for Transfer Learning in Low-Resource Neural Machine Translation

Yuan Gao, Feng Hou, Ruili Wang


Abstract
Existing transfer learning methods for neural machine translation typically use a well-trained translation model (i.e., a parent model) of a high-resource language pair to directly initialize a translation model (i.e., a child model) of a low-resource language pair, and the child model is then fine-tuned on the corresponding datasets. In this paper, we propose a novel two-step fine-tuning (TSFT) framework for transfer learning in low-resource neural machine translation. In the first step, we adjust the parameters of the parent model to fit the child source language by using the child source data. In the second step, we transfer the adjusted parameters to the child model and fine-tune it with a proposed distillation loss for efficient optimization. Our experimental results on five low-resource translation tasks demonstrate that our framework yields significant improvements over various strong transfer learning baselines. Further analysis demonstrates the effectiveness of the different components in our framework.
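
Since this page gives only the abstract, the following is a minimal, hypothetical sketch of how such a two-step fine-tuning procedure might look in PyTorch with a Hugging-Face-style seq2seq model. All names (adjust_parent, distillation_loss, two_step_finetune) and the exact form of the distillation loss are assumptions for illustration; they are not the authors' implementation.

```python
# Hypothetical sketch of a two-step fine-tuning workflow for transfer
# learning in low-resource NMT, loosely following the abstract.
# Assumes a model whose forward pass returns .loss and .logits
# (Hugging-Face style); all helper names are illustrative only.
import copy
import torch
import torch.nn.functional as F

def adjust_parent(parent_model, child_source_batches, optimizer, steps=1000):
    """Step 1: adapt the parent model's parameters to the child source
    language by continuing training on child-side source data."""
    parent_model.train()
    for _, batch in zip(range(steps), child_source_batches):
        loss = parent_model(**batch).loss
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
    return parent_model

def distillation_loss(student_logits, teacher_logits, labels, alpha=0.5, T=2.0):
    """Assumed form: interpolate cross-entropy on gold labels with a
    KL term that keeps the child close to the adjusted parent (teacher)."""
    ce = F.cross_entropy(student_logits.view(-1, student_logits.size(-1)),
                         labels.view(-1), ignore_index=-100)
    kl = F.kl_div(F.log_softmax(student_logits / T, dim=-1),
                  F.softmax(teacher_logits / T, dim=-1),
                  reduction="batchmean") * (T * T)
    return alpha * ce + (1.0 - alpha) * kl

def two_step_finetune(parent_model, child_batches, child_source_batches, lr=5e-5):
    """Step 2: initialize the child model from the adjusted parent and
    fine-tune it on the child parallel data with the distillation loss."""
    opt_parent = torch.optim.Adam(parent_model.parameters(), lr=lr)
    teacher = adjust_parent(parent_model, child_source_batches, opt_parent)

    child_model = copy.deepcopy(teacher)   # transfer the adjusted parameters
    opt_child = torch.optim.Adam(child_model.parameters(), lr=lr)

    child_model.train()
    teacher.eval()
    for batch in child_batches:
        with torch.no_grad():
            teacher_logits = teacher(**batch).logits
        student_out = child_model(**batch)
        loss = distillation_loss(student_out.logits, teacher_logits, batch["labels"])
        opt_child.zero_grad()
        loss.backward()
        opt_child.step()
    return child_model
```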
Anthology ID:
2024.findings-naacl.203
Volume:
Findings of the Association for Computational Linguistics: NAACL 2024
Month:
June
Year:
2024
Address:
Mexico City, Mexico
Editors:
Kevin Duh, Helena Gomez, Steven Bethard
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
3214–3224
URL:
https://aclanthology.org/2024.findings-naacl.203
Cite (ACL):
Yuan Gao, Feng Hou, and Ruili Wang. 2024. A Novel Two-step Fine-tuning Framework for Transfer Learning in Low-Resource Neural Machine Translation. In Findings of the Association for Computational Linguistics: NAACL 2024, pages 3214–3224, Mexico City, Mexico. Association for Computational Linguistics.
Cite (Informal):
A Novel Two-step Fine-tuning Framework for Transfer Learning in Low-Resource Neural Machine Translation (Gao et al., Findings 2024)
PDF:
https://aclanthology.org/2024.findings-naacl.203.pdf
Copyright:
2024.findings-naacl.203.copyright.pdf