Raphaël Esamotunu


2024

Translate your Own: a Post-Editing Experiment in the NLP domain
Rachel Bawden | Ziqian Peng | Maud Bénard | Éric Clergerie | Raphaël Esamotunu | Mathilde Huguin | Natalie Kübler | Alexandra Mestivier | Mona Michelot | Laurent Romary | Lichao Zhu | François Yvon
Proceedings of the 25th Annual Conference of the European Association for Machine Translation (Volume 1)

Improvements in neural machine translation make translation and post-editing pipelines ever more effective for a wider range of applications. In this paper, we evaluate the effectiveness of such a pipeline for the translation of scientific documents (limited here to article abstracts). Using a dedicated interface, we collect and then analyse the post-edits of approximately 350 abstracts (English→French) in the Natural Language Processing domain, produced by two groups of post-editors: domain experts (academics encouraged to post-edit their own articles) on the one hand, and trained translators on the other. Our results confirm that such pipelines can be effective, at least for high-resource language pairs. They also highlight differences in the post-editing strategies of the two groups. Finally, they suggest that working on term translation is the most pressing issue for improving fully automatic translations, but that in a post-editing setup, other error types can be equally annoying for post-editors.