Assessing the Ability of Neural Machine Translation Models to Perform Syntactic Rewriting

Jahkel Robin, Alvin Grissom II, Matthew Roselli


Abstract
We describe work in progress on evaluating the performance of sequence-to-sequence neural networks on the task of syntax-based reordering, using rules applicable to simultaneous machine translation. We train models that attempt to rewrite English sentences using rules commonly employed by human interpreters. We examine the performance of these models to determine which forms of rewriting are more difficult for them to learn and which architectures learn them best.
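The paper itself is not reproduced on this page, but as a rough illustration of the task the abstract describes, the sketch below shows the kind of (source, rewritten) sentence pairs a sequence-to-sequence model might be trained on, where each rewrite applies an interpreter-style reordering rule. The sentences, the rule description, and the file names are hypothetical examples for illustration only, not the authors' data or code.

```python
# Hypothetical illustration only: toy (source, target) pairs for a
# syntax-based rewriting task, written out as parallel text files of
# the kind standard seq2seq toolkits expect. The sentences and the
# reordering rule are invented for this sketch.

rewriting_pairs = [
    # Example rule: front a sentence-final subordinate clause so that
    # output can begin before the main clause is fully heard.
    ("She finished the report after the meeting ended",
     "After the meeting ended , she finished the report"),
    ("He will sign the contract if the terms are acceptable",
     "If the terms are acceptable , he will sign the contract"),
]

def write_parallel_files(pairs, src_path="train.src", tgt_path="train.tgt"):
    """Write one sentence per line to parallel source/target files."""
    with open(src_path, "w", encoding="utf-8") as src, \
         open(tgt_path, "w", encoding="utf-8") as tgt:
        for source, target in pairs:
            src.write(source + "\n")
            tgt.write(target + "\n")

if __name__ == "__main__":
    write_parallel_files(rewriting_pairs)
    # A model trained on such pairs can be scored by how often its
    # output exactly matches the reference rewrite.
    hypothesis = "After the meeting ended , she finished the report"
    reference = rewriting_pairs[0][1]
    print("exact match:", hypothesis == reference)
```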
Anthology ID:
W19-3648
Volume:
Proceedings of the 2019 Workshop on Widening NLP
Month:
August
Year:
2019
Address:
Florence, Italy
Editors:
Amittai Axelrod, Diyi Yang, Rossana Cunha, Samira Shaikh, Zeerak Waseem
Venue:
WiNLP
Publisher:
Association for Computational Linguistics
Pages:
152
URL:
https://aclanthology.org/W19-3648
Cite (ACL):
Jahkel Robin, Alvin Grissom II, and Matthew Roselli. 2019. Assessing the Ability of Neural Machine Translation Models to Perform Syntactic Rewriting. In Proceedings of the 2019 Workshop on Widening NLP, page 152, Florence, Italy. Association for Computational Linguistics.
Cite (Informal):
Assessing the Ability of Neural Machine Translation Models to Perform Syntactic Rewriting (Robin et al., WiNLP 2019)