%0 Conference Proceedings
%T Unsupervised Neural Machine Translation with Universal Grammar
%A Li, Zuchao
%A Utiyama, Masao
%A Sumita, Eiichiro
%A Zhao, Hai
%Y Moens, Marie-Francine
%Y Huang, Xuanjing
%Y Specia, Lucia
%Y Yih, Scott Wen-tau
%S Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
%D 2021
%8 November
%I Association for Computational Linguistics
%C Online and Punta Cana, Dominican Republic
%F li-etal-2021-unsupervised-neural
%X Machine translation usually relies on parallel corpora to provide parallel signals for training. The advent of unsupervised machine translation has freed machine translation from this reliance, though its performance still lags behind that of traditional supervised machine translation. In unsupervised machine translation, the model seeks symmetric language similarities as a source of weak parallel signal to achieve translation. Chomsky’s Universal Grammar theory postulates that grammar is innate to humans and is governed by universal principles and constraints. Therefore, in this paper, we seek to leverage such shared grammar clues to provide more explicit parallel signals between languages and thereby enhance the training of unsupervised machine translation models. Through experiments on multiple typical language pairs, we demonstrate the effectiveness of our proposed approaches.
%R 10.18653/v1/2021.emnlp-main.261
%U https://aclanthology.org/2021.emnlp-main.261
%U https://doi.org/10.18653/v1/2021.emnlp-main.261
%P 3249-3264