Advances and Challenges in Unsupervised Neural Machine Translation

Rui Wang, Hai Zhao


Abstract
Unsupervised cross-lingual language representation initialization methods, together with mechanisms such as denoising and back-translation, have advanced unsupervised neural machine translation (UNMT), which has achieved impressive results. Nevertheless, several challenges remain for UNMT. This tutorial first introduces the background and the latest progress of UNMT. We then examine a number of challenges to UNMT and give empirical results on how well the technology currently holds up.
Anthology ID:
2021.eacl-tutorials.5
Volume:
Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Tutorial Abstracts
Month:
April
Year:
2021
Address:
Online
Editors:
Isabelle Augenstein, Ivan Habernal
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
17–21
URL:
https://aclanthology.org/2021.eacl-tutorials.5
DOI:
10.18653/v1/2021.eacl-tutorials.5
Cite (ACL):
Rui Wang and Hai Zhao. 2021. Advances and Challenges in Unsupervised Neural Machine Translation. In Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Tutorial Abstracts, pages 17–21, online. Association for Computational Linguistics.
Cite (Informal):
Advances and Challenges in Unsupervised Neural Machine Translation (Wang & Zhao, EACL 2021)
PDF:
https://aclanthology.org/2021.eacl-tutorials.5.pdf