Cross-lingual Supervision Improves Unsupervised Neural Machine Translation

Mingxuan Wang, Hongxiao Bai, Hai Zhao, Lei Li


Abstract
We propose to improve unsupervised neural machine translation with cross-lingual supervision, which utilizes supervision signals from high-resource language pairs to improve the translation of zero-resource language pairs. Specifically, to train an En-Ro system without a parallel corpus, we can leverage parallel corpora from En-Fr and En-De to collectively train translation from one language into many languages under a single model. The approach is based on multilingual models and requires no changes to standard unsupervised NMT. Simple and effective, it improves translation quality by a large margin on benchmark unsupervised translation tasks and even achieves performance comparable to supervised NMT. In particular, it achieves 37.6 and 35.18 BLEU on the WMT’14 tasks, very close to the large-scale supervised setting, and 35.09 BLEU on the WMT’16 task, which even surpasses the supervised Transformer baseline.
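The abstract describes the training recipe only at a high level. The following is a minimal sketch of how such joint training might look: one shared multilingual model alternates between supervised updates on the high-resource parallel pairs (En-Fr, En-De) and standard unsupervised objectives (denoising autoencoding plus on-the-fly back-translation) on the zero-resource pair (En-Ro). This is not the authors' implementation; every class, function, and the mixing ratio p_supervised below are hypothetical placeholders.

# Minimal sketch (NOT the authors' code) of cross-lingual supervision
# for unsupervised NMT: one shared model, mixed training objectives.
import random

SUPERVISED_PAIRS = [("en", "fr"), ("en", "de")]   # parallel corpora available
ZERO_RESOURCE_PAIR = ("en", "ro")                 # monolingual corpora only

class MultilingualNMT:
    """Stand-in for a single shared encoder-decoder that translates one
    language into many, with the target language selected by a tag."""

    def translation_loss(self, src_batch, tgt_batch, tgt_tag):
        return 0.0  # placeholder: cross-entropy over tgt_batch given src_batch

    def denoising_loss(self, mono_batch, lang_tag):
        return 0.0  # placeholder: reconstruct mono_batch from a noised copy

    def translate(self, batch, tgt_tag):
        return batch  # placeholder: decode batch toward tgt_tag

def supervised_step(model, pair):
    # Standard supervised NMT loss on a high-resource parallel batch.
    src, tgt = [], []  # placeholder: sample a parallel (src, tgt) batch
    return model.translation_loss(src, tgt, tgt_tag=pair[1])

def unsupervised_step(model, pair):
    # Standard UNMT objectives on the zero-resource pair: denoising
    # autoencoding plus on-the-fly back-translation.
    mono = []  # placeholder: sample a monolingual batch in language pair[1]
    loss = model.denoising_loss(mono, lang_tag=pair[1])
    pseudo_src = model.translate(mono, tgt_tag=pair[0])  # back-translate
    loss += model.translation_loss(pseudo_src, mono, tgt_tag=pair[1])
    return loss

def training_step(model, p_supervised=0.5):
    # Because all pairs share one model, gradients from the supervised
    # pairs act as cross-lingual supervision for the En-Ro direction.
    if random.random() < p_supervised:
        return supervised_step(model, random.choice(SUPERVISED_PAIRS))
    return unsupervised_step(model, ZERO_RESOURCE_PAIR)

if __name__ == "__main__":
    model = MultilingualNMT()
    for _ in range(3):
        training_step(model)  # each loss would drive an optimizer step in practice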
Anthology ID:
2021.naacl-industry.12
Volume:
Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies: Industry Papers
Month:
June
Year:
2021
Address:
Online
Editors:
Young-bum Kim, Yunyao Li, Owen Rambow
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
89–96
URL:
https://aclanthology.org/2021.naacl-industry.12
DOI:
10.18653/v1/2021.naacl-industry.12
Cite (ACL):
Mingxuan Wang, Hongxiao Bai, Hai Zhao, and Lei Li. 2021. Cross-lingual Supervision Improves Unsupervised Neural Machine Translation. In Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies: Industry Papers, pages 89–96, Online. Association for Computational Linguistics.
Cite (Informal):
Cross-lingual Supervision Improves Unsupervised Neural Machine Translation (Wang et al., NAACL 2021)
PDF:
https://aclanthology.org/2021.naacl-industry.12.pdf
Video:
https://aclanthology.org/2021.naacl-industry.12.mp4