Improving Both Domain Robustness and Domain Adaptability in Machine Translation

Wen Lai, Jindřich Libovický, Alexander Fraser


Abstract
We consider two problems of NMT domain adaptation using meta-learning. First, we want to reach domain robustness, i.e., high translation quality both on the domains seen in the training data and on unseen domains. Second, we want our systems to be adaptive, i.e., it should be possible to fine-tune a system with just hundreds of in-domain parallel sentences. We study the domain adaptability of meta-learning while improving the domain robustness of the model. In this paper, we propose a novel approach, RMLNMT (Robust Meta-Learning Framework for Neural Machine Translation Domain Adaptation), which improves the robustness of existing meta-learning models. More specifically, we show how to use a domain classifier in curriculum learning, and we integrate the word-level domain-mixing model into the meta-learning framework with a balanced sampling strategy. Experiments on English-German and English-Chinese translation show that RMLNMT improves both domain robustness and domain adaptability in seen and unseen domains.
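The abstract's core machinery is a meta-learning outer loop that samples tasks from several domains in a balanced way, so that the resulting initialization adapts quickly to any single domain. The sketch below illustrates that idea only, using a first-order (Reptile-style) update on toy one-parameter linear "domains"; the domain names, learning rates, and task setup are all hypothetical stand-ins, not the RMLNMT implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for per-domain NMT data: each "domain" is a linear task
# y = w_d * x with a domain-specific weight w_d (hypothetical values).
domain_weights = {"law": 2.0, "medical": 3.0, "it": 4.0}

def sample_batch(w, n=16):
    x = rng.normal(size=n)
    return x, w * x

def inner_update(theta, x, y, lr=0.05, steps=5):
    # A few SGD steps on one domain's batch (the meta-learning inner loop).
    for _ in range(steps):
        grad = 2 * np.mean((theta * x - y) * x)  # d/dtheta of squared error
        theta -= lr * grad
    return theta

# Reptile-style outer loop with balanced sampling: every meta-iteration
# draws exactly one task per domain, so no domain dominates the update.
theta = 0.0
for it in range(200):
    for w in domain_weights.values():
        x, y = sample_batch(w)
        adapted = inner_update(theta, x, y)
        theta += 0.1 * (adapted - theta)  # move toward the adapted params
```

After training, `theta` sits near the centre of the three domain optima, which is the point from which a handful of inner-loop steps suffice to specialize to any one domain; this is the "adaptability from hundreds of sentences" property the abstract targets, shown here in miniature.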
Anthology ID:
2022.coling-1.461
Volume:
Proceedings of the 29th International Conference on Computational Linguistics
Month:
October
Year:
2022
Address:
Gyeongju, Republic of Korea
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
5191–5204
URL:
https://aclanthology.org/2022.coling-1.461
Cite (ACL):
Wen Lai, Jindřich Libovický, and Alexander Fraser. 2022. Improving Both Domain Robustness and Domain Adaptability in Machine Translation. In Proceedings of the 29th International Conference on Computational Linguistics, pages 5191–5204, Gyeongju, Republic of Korea. International Committee on Computational Linguistics.
Cite (Informal):
Improving Both Domain Robustness and Domain Adaptability in Machine Translation (Lai et al., COLING 2022)
PDF:
https://aclanthology.org/2022.coling-1.461.pdf
Code:
lavine-lmu/rmlnmt