Neural Machine Translation Quality and Post-Editing Performance

Vilém Zouhar, Martin Popel, Ondřej Bojar, Aleš Tamchyna


Abstract
We test the natural expectation that using MT in professional translation saves human processing time. The last such study was carried out by Sanchez-Torron and Koehn (2016) with phrase-based MT, artificially reducing translation quality. In contrast, we focus on high-quality neural MT (NMT), which has since become the state-of-the-art approach and has been adopted by most translation companies. Through an experimental study involving over 30 professional translators for English→Czech translation, we examine the relationship between NMT performance and post-editing time and quality. Across all models, we found that better MT systems indeed lead to fewer changes to the sentences in this industry setting. The relation between system quality and post-editing time is, however, not straightforward and, contrary to the results on phrase-based MT, BLEU is definitely not a stable predictor of post-editing time or final output quality.
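For readers unfamiliar with the metric the abstract critiques, a minimal pure-Python sketch of corpus-level BLEU follows. This is illustrative only, not the paper's evaluation code (studies typically use sacrebleu); the function names and whitespace tokenization are assumptions for the example.

```python
# Minimal corpus-level BLEU sketch: uniform 1..4-gram weights plus a
# brevity penalty. Illustrative only; not the paper's evaluation setup.
import math
from collections import Counter

def ngrams(tokens, n):
    """Count the n-grams of a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def bleu(hypotheses, references, max_n=4):
    """Corpus BLEU (0-100) for parallel lists of hypothesis/reference strings."""
    match = [0] * max_n   # clipped n-gram matches per order
    total = [0] * max_n   # hypothesis n-gram counts per order
    hyp_len = ref_len = 0
    for hyp, ref in zip(hypotheses, references):
        h, r = hyp.split(), ref.split()  # naive whitespace tokenization
        hyp_len += len(h)
        ref_len += len(r)
        for n in range(1, max_n + 1):
            h_ngrams, r_ngrams = ngrams(h, n), ngrams(r, n)
            match[n - 1] += sum(min(c, r_ngrams[g]) for g, c in h_ngrams.items())
            total[n - 1] += max(len(h) - n + 1, 0)
    if min(match) == 0:  # any order with zero matches drives BLEU to 0
        return 0.0
    log_prec = sum(math.log(m / t) for m, t in zip(match, total)) / max_n
    bp = 1.0 if hyp_len > ref_len else math.exp(1 - ref_len / hyp_len)
    return 100 * bp * math.exp(log_prec)
```

The paper's finding is precisely that a higher score from a metric like this does not reliably translate into shorter post-editing time or better final output quality.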
Anthology ID:
2021.emnlp-main.801
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
10204–10214
URL:
https://aclanthology.org/2021.emnlp-main.801
DOI:
10.18653/v1/2021.emnlp-main.801
Cite (ACL):
Vilém Zouhar, Martin Popel, Ondřej Bojar, and Aleš Tamchyna. 2021. Neural Machine Translation Quality and Post-Editing Performance. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 10204–10214, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Neural Machine Translation Quality and Post-Editing Performance (Zouhar et al., EMNLP 2021)
PDF:
https://aclanthology.org/2021.emnlp-main.801.pdf
Software:
 2021.emnlp-main.801.Software.zip
Video:
 https://aclanthology.org/2021.emnlp-main.801.mp4