Rethinking Round-Trip Translation for Machine Translation Evaluation

Terry Yue Zhuo, Qiongkai Xu, Xuanli He, Trevor Cohn


Abstract
Automatic evaluation methods for translation often require model training, and the limited availability of parallel corpora therefore restricts their applicability in low-resource settings. Round-trip translation is a potential workaround: it reframes bilingual evaluation as a much simpler monolingual task. Early results from the era of statistical machine translation (SMT) raised fundamental concerns about the utility of this approach, based on poor correlation with human judgments of translation quality. In this paper, we revisit the technique with modern neural machine translation (NMT) and show that round-trip translation does allow accurate automatic evaluation without the need for reference translations. The contradictory findings can be explained by the copy mechanism in SMT, which is absent in NMT. We demonstrate that round-trip translation benefits multiple machine translation evaluation tasks: i) predicting forward translation scores; ii) improving the performance of a quality estimation model; and iii) identifying adversarial competitors in shared tasks via cross-system verification.
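To make the round-trip idea concrete, the following is a minimal sketch rather than the authors' exact pipeline: the Hugging Face MarianMT checkpoints and sentence-level BLEU as the monolingual comparison metric are illustrative assumptions, not choices taken from the paper.

# Round-trip translation sketch: translate EN -> DE -> EN, then score the
# back-translation against the original source. Because hypothesis and
# reference are both English, no target-side reference translation is needed.
# Assumption: Helsinki-NLP MarianMT checkpoints stand in for the MT systems
# under evaluation; any forward/backward model pair would work.
from transformers import MarianMTModel, MarianTokenizer
import sacrebleu

def load_pair(name):
    return MarianTokenizer.from_pretrained(name), MarianMTModel.from_pretrained(name)

def translate(text, tokenizer, model):
    batch = tokenizer([text], return_tensors="pt", padding=True)
    output = model.generate(**batch)
    return tokenizer.decode(output[0], skip_special_tokens=True)

fwd_tok, fwd_model = load_pair("Helsinki-NLP/opus-mt-en-de")  # forward: EN -> DE
bwd_tok, bwd_model = load_pair("Helsinki-NLP/opus-mt-de-en")  # backward: DE -> EN

source = "Round-trip translation reframes bilingual evaluation as a monolingual task."
forward = translate(source, fwd_tok, fwd_model)
round_trip = translate(forward, bwd_tok, bwd_model)

# Monolingual scoring: compare the round-trip output to the source itself.
rtt_score = sacrebleu.sentence_bleu(round_trip, [source]).score
print(f"Forward (DE): {forward}")
print(f"Round trip  : {round_trip}")
print(f"RTT BLEU    : {rtt_score:.1f}")

On the paper's account, such a monolingual score tracks forward translation quality for NMT systems, whereas SMT's copy mechanism could inflate it by passing unfamiliar source text through unchanged.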
Anthology ID: 2023.findings-acl.22
Volume: Findings of the Association for Computational Linguistics: ACL 2023
Month: July
Year: 2023
Address: Toronto, Canada
Editors: Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 319–337
URL: https://aclanthology.org/2023.findings-acl.22
DOI: 10.18653/v1/2023.findings-acl.22
Cite (ACL): Terry Yue Zhuo, Qiongkai Xu, Xuanli He, and Trevor Cohn. 2023. Rethinking Round-Trip Translation for Machine Translation Evaluation. In Findings of the Association for Computational Linguistics: ACL 2023, pages 319–337, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal): Rethinking Round-Trip Translation for Machine Translation Evaluation (Zhuo et al., Findings 2023)
PDF: https://aclanthology.org/2023.findings-acl.22.pdf