Inducing Grammars with and for Neural Machine Translation

Yonatan Bisk, Ke Tran


Abstract
Machine translation systems require semantic knowledge and grammatical understanding. Neural machine translation (NMT) systems often assume this information is captured by an attention mechanism and a decoder that ensures fluency. Recent work has shown that incorporating explicit syntax alleviates the burden of modeling both types of knowledge. However, requiring parses is expensive and does not explore the question of what syntax a model needs during translation. To address both of these issues we introduce a model that simultaneously translates while inducing dependency trees. In this way, we leverage the benefits of structure while investigating what syntax NMT must induce to maximize performance. We show that our dependency trees are 1) language-pair dependent and 2) improve translation quality.
Anthology ID:
W18-2704
Volume:
Proceedings of the 2nd Workshop on Neural Machine Translation and Generation
Month:
July
Year:
2018
Address:
Melbourne, Australia
Editors:
Alexandra Birch, Andrew Finch, Thang Luong, Graham Neubig, Yusuke Oda
Venue:
NGT
Publisher:
Association for Computational Linguistics
Pages:
25–35
URL:
https://aclanthology.org/W18-2704
DOI:
10.18653/v1/W18-2704
Bibkey:
Cite (ACL):
Yonatan Bisk and Ke Tran. 2018. Inducing Grammars with and for Neural Machine Translation. In Proceedings of the 2nd Workshop on Neural Machine Translation and Generation, pages 25–35, Melbourne, Australia. Association for Computational Linguistics.
Cite (Informal):
Inducing Grammars with and for Neural Machine Translation (Bisk & Tran, NGT 2018)
PDF:
https://aclanthology.org/W18-2704.pdf