Difficulty-Aware Machine Translation Evaluation

Runzhe Zhan, Xuebo Liu, Derek F. Wong, Lidia S. Chao


Abstract
The high-quality translation results produced by machine translation (MT) systems still pose a huge challenge for automatic evaluation. Current MT evaluation pays the same attention to each sentence component, whereas the questions in real-world examinations (e.g., university examinations) have different difficulties and weightings. In this paper, we propose a novel difficulty-aware MT evaluation metric, expanding the evaluation dimension by taking translation difficulty into consideration. A translation that fails to be predicted by most MT systems is treated as a difficult one and assigned a large weight in the final score function, and vice versa. Experimental results on the WMT19 English-German Metrics shared tasks show that our proposed method outperforms commonly used MT metrics in terms of human correlation. In particular, our proposed method performs well even when all the MT systems are very competitive, which is when most existing metrics fail to distinguish between them. The source code is freely available at https://github.com/NLP2CT/Difficulty-Aware-MT-Evaluation.
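The weighting idea in the abstract can be sketched in a few lines. The snippet below is a minimal illustration, not the paper's exact formulation: it assumes we already have per-reference similarity scores (e.g., from an embedding-based metric) for each MT system, derives a difficulty weight as one minus the average score across systems, and computes a difficulty-weighted final score for one system. The function names and the sentence-level granularity are assumptions for illustration only.

```python
def difficulty_weights(system_scores):
    """Difficulty of reference j = 1 - mean similarity across systems.

    system_scores[i][j] is the similarity of system i's hypothesis
    to reference j; references that most systems translate poorly
    receive weights close to 1 (difficult), and vice versa.
    """
    n_systems = len(system_scores)
    n_refs = len(system_scores[0])
    return [
        1.0 - sum(system_scores[i][j] for i in range(n_systems)) / n_systems
        for j in range(n_refs)
    ]

def difficulty_aware_score(scores, weights):
    """Weighted average of one system's scores: difficult references dominate."""
    total = sum(weights)
    if total == 0:
        # All references were trivially easy; fall back to a plain mean.
        return sum(scores) / len(scores)
    return sum(s * w for s, w in zip(scores, weights)) / total
```

For example, if two systems score [0.9, 0.2] and [0.8, 0.4] on two references, the second reference gets a much larger difficulty weight (0.7 vs. 0.15), so a system's performance on it contributes more to the final score.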
Anthology ID:
2021.acl-short.5
Volume:
Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 2: Short Papers)
Month:
August
Year:
2021
Address:
Online
Venues:
ACL | IJCNLP
Publisher:
Association for Computational Linguistics
Pages:
26–32
URL:
https://aclanthology.org/2021.acl-short.5
DOI:
10.18653/v1/2021.acl-short.5
Cite (ACL):
Runzhe Zhan, Xuebo Liu, Derek F. Wong, and Lidia S. Chao. 2021. Difficulty-Aware Machine Translation Evaluation. In Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 2: Short Papers), pages 26–32, Online. Association for Computational Linguistics.
Cite (Informal):
Difficulty-Aware Machine Translation Evaluation (Zhan et al., ACL 2021)
PDF:
https://aclanthology.org/2021.acl-short.5.pdf
Video:
https://aclanthology.org/2021.acl-short.5.mp4
Code:
NLP2CT/Difficulty-Aware-MT-Evaluation