Semantic Neural Machine Translation Using AMR

Linfeng Song, Daniel Gildea, Yue Zhang, Zhiguo Wang, Jinsong Su


Abstract
It is intuitive that semantic representations can be useful for machine translation, mainly because they help enforce meaning preservation and mitigate the data sparsity of machine translation models (many sentences correspond to one meaning). However, little work has been done on leveraging semantics for neural machine translation (NMT). In this work, we study the usefulness of AMR (abstract meaning representation) for NMT. Experiments on a standard English-to-German dataset show that incorporating AMR as additional knowledge can significantly improve a strong attention-based sequence-to-sequence neural translation model.
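To make "incorporating AMR as additional knowledge" concrete, the sketch below shows one way a decoder can attend jointly over sentence-token states and AMR-node states, concatenating the two context vectors. This is a minimal, illustrative NumPy sketch: the `attend` helper, all dimensions, and the random states are assumptions for demonstration, not the paper's actual architecture or implementation.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def attend(query, states):
    # Dot-product attention: score each state against the query,
    # normalize, and return the weighted sum (context vector).
    weights = softmax(states @ query)
    return weights @ states

rng = np.random.default_rng(0)
seq_states = rng.normal(size=(6, 8))  # hypothetical: 6 source tokens, dim 8
amr_states = rng.normal(size=(4, 8))  # hypothetical: 4 AMR graph nodes, dim 8
dec_state  = rng.normal(size=8)       # hypothetical current decoder state

# Dual attention: one context from the token sequence, one from the
# AMR graph, concatenated to condition the next decoding step.
context = np.concatenate([attend(dec_state, seq_states),
                          attend(dec_state, amr_states)])
assert context.shape == (16,)
```

In practice the graph-side states would come from a dedicated graph encoder over the AMR structure rather than from independent random vectors; the point here is only the shape of the dual-attention combination.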
Anthology ID:
Q19-1002
Volume:
Transactions of the Association for Computational Linguistics, Volume 7
Year:
2019
Address:
Cambridge, MA
Editors:
Lillian Lee, Mark Johnson, Brian Roark, Ani Nenkova
Venue:
TACL
Publisher:
MIT Press
Pages:
19–31
URL:
https://aclanthology.org/Q19-1002
DOI:
10.1162/tacl_a_00252
Cite (ACL):
Linfeng Song, Daniel Gildea, Yue Zhang, Zhiguo Wang, and Jinsong Su. 2019. Semantic Neural Machine Translation Using AMR. Transactions of the Association for Computational Linguistics, 7:19–31.
Cite (Informal):
Semantic Neural Machine Translation Using AMR (Song et al., TACL 2019)
PDF:
https://aclanthology.org/Q19-1002.pdf
Code:
 freesunshine0316/semantic-nmt