MiSS@WMT21: Contrastive Learning-reinforced Domain Adaptation in Neural Machine Translation

Zuchao Li, Masao Utiyama, Eiichiro Sumita, Hai Zhao


Abstract
In this paper, we describe our MiSS system, which participated in the WMT21 news translation task. We took part in three translation directions of the English-Chinese and Japanese-English translation tasks. In the submitted systems, we primarily explored wider networks, deeper networks, relative positional encoding, and dynamic convolutional networks on the model-structure side, while on the training side we investigated contrastive learning-reinforced domain adaptation, self-supervised training, and optimization-objective switching. According to the final evaluation results, a deeper, wider, and stronger network improves translation performance in general, yet our domain adaptation method yields even larger gains. In addition, we found that switching to our proposed objective during the fine-tuning phase, using relatively small domain-related data, effectively improves the stability of the model's convergence and achieves better final performance.
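The abstract does not spell out the contrastive objective itself. As an illustrative sketch only: contrastive domain-adaptation methods commonly build on an InfoNCE-style loss over sentence embeddings, where in-domain pairs are pulled together and other sentences in the batch act as negatives. The function name, embedding source, and temperature below are our assumptions for illustration, not details from the paper.

```python
import numpy as np

def info_nce_loss(anchors, positives, temperature=0.1):
    """InfoNCE-style contrastive loss (illustrative, not the paper's exact
    objective): the positive for anchor i is row i of `positives`; all
    other rows in the batch serve as in-batch negatives."""
    # L2-normalise so dot products become cosine similarities
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    logits = a @ p.T / temperature                 # (n, n) similarity matrix
    logits -= logits.max(axis=1, keepdims=True)    # numerical stability
    # log-softmax over each row; the correct pairs lie on the diagonal
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))
```

With perfectly aligned, well-separated embeddings the loss approaches zero; mismatched or indistinct embeddings drive it toward log of the batch size, which is what makes such a loss usable as a signal for separating in-domain from out-of-domain data.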
Anthology ID:
2021.wmt-1.12
Volume:
Proceedings of the Sixth Conference on Machine Translation
Month:
November
Year:
2021
Address:
Online
Editors:
Loïc Barrault, Ondřej Bojar, Fethi Bougares, Rajen Chatterjee, Marta R. Costa-jussà, Christian Federmann, Mark Fishel, Alexander Fraser, Markus Freitag, Yvette Graham, Roman Grundkiewicz, Paco Guzmán, Barry Haddow, Matthias Huck, Antonio Jimeno Yepes, Philipp Koehn, Tom Kocmi, André Martins, Makoto Morishita, Christof Monz
Venue:
WMT
SIG:
SIGMT
Publisher:
Association for Computational Linguistics
Pages:
154–161
URL:
https://aclanthology.org/2021.wmt-1.12
Cite (ACL):
Zuchao Li, Masao Utiyama, Eiichiro Sumita, and Hai Zhao. 2021. MiSS@WMT21: Contrastive Learning-reinforced Domain Adaptation in Neural Machine Translation. In Proceedings of the Sixth Conference on Machine Translation, pages 154–161, Online. Association for Computational Linguistics.
Cite (Informal):
MiSS@WMT21: Contrastive Learning-reinforced Domain Adaptation in Neural Machine Translation (Li et al., WMT 2021)
PDF:
https://aclanthology.org/2021.wmt-1.12.pdf