The Mininglamp Machine Translation System for WMT21

Shiyu Zhao, Xiaopu Li, Minghui Wu, Jie Hao


Abstract
This paper describes the Mininglamp neural machine translation systems for the WMT 2021 news translation tasks. We participated in eight translation directions for news text: Chinese to/from English, Hausa to/from English, German to/from English, and French to/from German. Our base system was built on the Transformer architecture, with wider or smaller configurations for the different news translation tasks. We mainly used back-translation, knowledge distillation, and fine-tuning to strengthen the single models, and ensembling to combine them. Our final submission ranked first for the English to/from Hausa task.
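As a rough illustration of the back-translation step mentioned in the abstract, the sketch below (not the authors' actual pipeline or toolkit) translates target-side monolingual text back into the source language with a reverse-direction model and pairs the synthetic sources with the original targets as extra training data. The MarianMT checkpoint from Hugging Face is only an illustrative stand-in.

from transformers import MarianMTModel, MarianTokenizer

# Reverse-direction model: for a (hypothetical) Chinese-to-English system,
# back-translate English monolingual text into synthetic Chinese sources.
model_name = "Helsinki-NLP/opus-mt-en-zh"  # illustrative stand-in, not the authors' model
tokenizer = MarianTokenizer.from_pretrained(model_name)
model = MarianMTModel.from_pretrained(model_name)

# Target-side monolingual sentences (placeholders).
monolingual_targets = [
    "The talks are expected to continue next week.",
    "Officials announced the new policy on Monday.",
]

batch = tokenizer(monolingual_targets, return_tensors="pt", padding=True, truncation=True)
generated = model.generate(**batch, num_beams=5, max_length=128)
synthetic_sources = tokenizer.batch_decode(generated, skip_special_tokens=True)

# Synthetic parallel corpus: (back-translated source, original target),
# to be mixed with genuine bitext when training the forward model.
synthetic_pairs = list(zip(synthetic_sources, monolingual_targets))
for src, tgt in synthetic_pairs:
    print(src, "\t", tgt)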
Anthology ID:
2021.wmt-1.25
Volume:
Proceedings of the Sixth Conference on Machine Translation
Month:
November
Year:
2021
Address:
Online
Editors:
Loïc Barrault, Ondřej Bojar, Fethi Bougares, Rajen Chatterjee, Marta R. Costa-jussà, Christian Federmann, Mark Fishel, Alexander Fraser, Markus Freitag, Yvette Graham, Roman Grundkiewicz, Paco Guzmán, Barry Haddow, Matthias Huck, Antonio Jimeno Yepes, Philipp Koehn, Tom Kocmi, André Martins, Makoto Morishita, Christof Monz
Venue:
WMT
SIG:
SIGMT
Publisher:
Association for Computational Linguistics
Pages:
260–264
URL:
https://aclanthology.org/2021.wmt-1.25
Cite (ACL):
Shiyu Zhao, Xiaopu Li, Minghui Wu, and Jie Hao. 2021. The Mininglamp Machine Translation System for WMT21. In Proceedings of the Sixth Conference on Machine Translation, pages 260–264, Online. Association for Computational Linguistics.
Cite (Informal):
The Mininglamp Machine Translation System for WMT21 (Zhao et al., WMT 2021)
PDF:
https://aclanthology.org/2021.wmt-1.25.pdf