Improved Zero-shot Neural Machine Translation via Ignoring Spurious Correlations

Jiatao Gu, Yong Wang, Kyunghyun Cho, Victor O.K. Li


Abstract
Zero-shot translation, translating between language pairs on which a Neural Machine Translation (NMT) system has never been trained, is an emergent property when training the system in multilingual settings. However, naive training for zero-shot NMT easily fails and is sensitive to hyper-parameter settings. Its performance typically lags far behind the more conventional pivot-based approach, which translates twice using a third language as a pivot. In this work, we address the degeneracy problem caused by the model capturing spurious correlations, quantitatively analyzing the mutual information between the language IDs of the source and decoded sentences. Inspired by this analysis, we propose two simple but effective approaches: (1) decoder pre-training; (2) back-translation. These methods show significant improvement (4–22 BLEU points) over vanilla zero-shot translation on three challenging multilingual datasets, and achieve results similar to or better than the pivot-based approach.
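The degeneracy analysis can be made concrete with a small empirical estimate: if a zero-shot model ignores the target-language token and instead emits text in a language correlated with the source (the spurious correlation), the mutual information between the source language ID and the language detected in the decoded output stays high. The sketch below is not from the paper; it is a minimal illustration assuming we already have (source language ID, detected output language ID) pairs, e.g. produced by an off-the-shelf language identifier run over the decoded sentences.

```python
import math
from collections import Counter

def mutual_information(pairs):
    """Empirical mutual information I(S; D), in nats, between source-language
    IDs S and the language IDs D detected in the decoded outputs."""
    n = len(pairs)
    joint = Counter(pairs)                    # counts of (src_id, dec_id)
    src = Counter(s for s, _ in pairs)        # marginal counts over source IDs
    dec = Counter(d for _, d in pairs)        # marginal counts over decoded IDs
    mi = 0.0
    for (s, d), c in joint.items():
        p_sd = c / n
        mi += p_sd * math.log(p_sd / ((src[s] / n) * (dec[d] / n)))
    return mi

# Toy example (hypothetical data): a degenerate zero-shot model whose output
# language always matches the source language yields maximal I(S; D) here.
pairs = [("fr", "fr"), ("fr", "fr"), ("de", "de"), ("de", "de")]
print(mutual_information(pairs))  # log(2) ≈ 0.693 nats
```

Under this reading, a well-behaved multilingual model decodes in whatever language the target token requests, so the estimate should drop toward zero once the spurious source-to-output correlation is removed.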
Anthology ID: P19-1121
Volume: Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics
Month: July
Year: 2019
Address: Florence, Italy
Editors: Anna Korhonen, David Traum, Lluís Màrquez
Venue: ACL
Publisher: Association for Computational Linguistics
Pages: 1258–1268
URL: https://aclanthology.org/P19-1121
DOI: 10.18653/v1/P19-1121
Cite (ACL): Jiatao Gu, Yong Wang, Kyunghyun Cho, and Victor O.K. Li. 2019. Improved Zero-shot Neural Machine Translation via Ignoring Spurious Correlations. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, pages 1258–1268, Florence, Italy. Association for Computational Linguistics.
Cite (Informal): Improved Zero-shot Neural Machine Translation via Ignoring Spurious Correlations (Gu et al., ACL 2019)
PDF: https://aclanthology.org/P19-1121.pdf
Data: Europarl