Combination of Neural Machine Translation Systems at WMT20

Benjamin Marie, Raphael Rubino, Atsushi Fujita


Abstract
This paper presents neural machine translation systems and their combination built for the WMT20 English-Polish and Japanese→English translation tasks. We show that using a Transformer Big architecture, additional training data synthesized from monolingual data, and combining many NMT systems through n-best list reranking improve translation quality. However, while we observed such improvements on the validation data, we did not observe similar improvements on the test data. Our analysis reveals that the presence of translationese texts in the validation data led us to take decisions in building NMT systems that were not optimal to obtain the best results on the test data.
Anthology ID:
2020.wmt-1.23
Volume:
Proceedings of the Fifth Conference on Machine Translation
Month:
November
Year:
2020
Address:
Online
Editors:
Loïc Barrault, Ondřej Bojar, Fethi Bougares, Rajen Chatterjee, Marta R. Costa-jussà, Christian Federmann, Mark Fishel, Alexander Fraser, Yvette Graham, Paco Guzman, Barry Haddow, Matthias Huck, Antonio Jimeno Yepes, Philipp Koehn, André Martins, Makoto Morishita, Christof Monz, Masaaki Nagata, Toshiaki Nakazawa, Matteo Negri
Venue:
WMT
SIG:
SIGMT
Publisher:
Association for Computational Linguistics
Pages:
230–238
URL:
https://aclanthology.org/2020.wmt-1.23
Cite (ACL):
Benjamin Marie, Raphael Rubino, and Atsushi Fujita. 2020. Combination of Neural Machine Translation Systems at WMT20. In Proceedings of the Fifth Conference on Machine Translation, pages 230–238, Online. Association for Computational Linguistics.
Cite (Informal):
Combination of Neural Machine Translation Systems at WMT20 (Marie et al., WMT 2020)
PDF:
https://aclanthology.org/2020.wmt-1.23.pdf
Video:
https://slideslive.com/38939600