%0 Conference Proceedings
%T A Multilingual View of Unsupervised Machine Translation
%A Garcia, Xavier
%A Foret, Pierre
%A Sellam, Thibault
%A Parikh, Ankur
%Y Cohn, Trevor
%Y He, Yulan
%Y Liu, Yang
%S Findings of the Association for Computational Linguistics: EMNLP 2020
%D 2020
%8 November
%I Association for Computational Linguistics
%C Online
%F garcia-etal-2020-multilingual
%X We present a probabilistic framework for multilingual neural machine translation that encompasses supervised and unsupervised setups, focusing on unsupervised translation. In addition to studying the vanilla case where there is only monolingual data available, we propose a novel setup where one language in the (source, target) pair is not associated with any parallel data, but there may exist auxiliary parallel data that contains the other. This auxiliary data can naturally be utilized in our probabilistic framework via a novel cross-translation loss term. Empirically, we show that our approach results in higher BLEU scores over state-of-the-art unsupervised models on the WMT'14 English-French, WMT'16 English-German, and WMT'16 English-Romanian datasets in most directions.
%R 10.18653/v1/2020.findings-emnlp.283
%U https://aclanthology.org/2020.findings-emnlp.283
%U https://doi.org/10.18653/v1/2020.findings-emnlp.283
%P 3160-3170