Learning Non-Monotonic Automatic Post-Editing of Translations from Human Orderings

António Góis, Kyunghyun Cho, André Martins


Abstract
Recent research in neural machine translation has explored flexible generation orders, as an alternative to left-to-right generation. However, training non-monotonic models brings a new complication: how to search for a good ordering when there is a combinatorial explosion of orderings arriving at the same final result? Also, how do these automatic orderings compare with the actual behaviour of human translators? Current models rely on manually built biases or are left to explore all possibilities on their own. In this paper, we analyze the orderings produced by human post-editors and use them to train an automatic post-editing system. We compare the resulting system with those trained with left-to-right and random post-editing orderings. We observe that humans tend to follow a nearly left-to-right order, but with interesting deviations, such as preferring to start by correcting punctuation or verbs.
Anthology ID:
2020.eamt-1.22
Volume:
Proceedings of the 22nd Annual Conference of the European Association for Machine Translation
Month:
November
Year:
2020
Address:
Lisboa, Portugal
Editors:
André Martins, Helena Moniz, Sara Fumega, Bruno Martins, Fernando Batista, Luisa Coheur, Carla Parra, Isabel Trancoso, Marco Turchi, Arianna Bisazza, Joss Moorkens, Ana Guerberof, Mary Nurminen, Lena Marg, Mikel L. Forcada
Venue:
EAMT
Publisher:
European Association for Machine Translation
Pages:
205–214
URL:
https://aclanthology.org/2020.eamt-1.22
Cite (ACL):
António Góis, Kyunghyun Cho, and André Martins. 2020. Learning Non-Monotonic Automatic Post-Editing of Translations from Human Orderings. In Proceedings of the 22nd Annual Conference of the European Association for Machine Translation, pages 205–214, Lisboa, Portugal. European Association for Machine Translation.
Cite (Informal):
Learning Non-Monotonic Automatic Post-Editing of Translations from Human Orderings (Góis et al., EAMT 2020)
PDF:
https://aclanthology.org/2020.eamt-1.22.pdf
Code:
antoniogois/keystrokes_ape
Data:
APE