Correction of Errors in Preference Ratings from Automated Metrics for Text Generation

Jan Deriu, Pius von Däniken, Don Tuggener, Mark Cieliebak


Abstract
A major challenge in the field of Text Generation is evaluation: Human evaluations are cost-intensive, and automated metrics often display considerable disagreements with human judgments. In this paper, we propose to apply automated metrics for Text Generation in a preference-based evaluation protocol. The protocol features a statistical model that incorporates various levels of uncertainty to account for the error-proneness of the metrics. We show that existing metrics are generally over-confident in assigning significant differences between systems. As a remedy, the model allows combining human ratings with automated ratings. We show that it can reduce the amount of human ratings required to arrive at robust and statistically significant results by more than 50%, while yielding the same evaluation outcome as a purely human evaluation in 95% of cases. We showcase the benefits of the evaluation protocol for three text generation tasks: dialogue systems, machine translation, and text summarization.
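To make the abstract's core idea concrete, the following is a minimal toy sketch of combining a small set of trusted human preference judgments with a larger set of noisy metric-based ones under an explicit error model. It is not the paper's actual statistical model: it assumes a symmetric flip-noise channel for the metric and a uniform prior, and all counts and the 30% error rate in the example are hypothetical.

import numpy as np

def posterior_theta(h_wins, h_total, m_wins, m_total, err, grid=2001):
    """Grid-approximate posterior over theta = P(system A preferred over B).

    Human comparisons are treated as direct Bernoulli(theta) draws.
    Metric comparisons are treated as draws from a channel that flips
    the true preference with probability `err` (the metric's error rate,
    e.g. estimated on a small human-labelled calibration set):
        P(metric says A) = theta * (1 - err) + (1 - theta) * err
    """
    theta = np.linspace(1e-6, 1 - 1e-6, grid)
    # Uniform prior; work in log space for numerical stability.
    log_post = h_wins * np.log(theta) + (h_total - h_wins) * np.log(1 - theta)
    p_m = theta * (1 - err) + (1 - theta) * err
    log_post += m_wins * np.log(p_m) + (m_total - m_wins) * np.log(1 - p_m)
    log_post -= log_post.max()
    post = np.exp(log_post)
    post /= post.sum()
    return theta, post

def credible_interval(theta, post, mass=0.95):
    # Central credible interval from the discretized posterior CDF.
    cdf = np.cumsum(post)
    lo = theta[np.searchsorted(cdf, (1 - mass) / 2)]
    hi = theta[np.searchsorted(cdf, 1 - (1 - mass) / 2)]
    return lo, hi

# Hypothetical example: 40 human comparisons (26 prefer A) plus 400
# metric comparisons (240 prefer A) with an assumed 30% metric error rate.
theta, post = posterior_theta(h_wins=26, h_total=40,
                              m_wins=240, m_total=400, err=0.30)
lo, hi = credible_interval(theta, post)
print(f"95% credible interval for P(A > B): [{lo:.3f}, {hi:.3f}]")
print("Significant at 95%" if lo > 0.5 or hi < 0.5 else "Not significant")

In this sketch, a high assumed error rate discounts the metric's votes, so the many automated comparisons tighten the interval only modestly beyond the human data, which mirrors the paper's finding that uncorrected metrics are over-confident in declaring significant differences.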
Anthology ID:
2023.findings-acl.404
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
6456–6474
URL:
https://aclanthology.org/2023.findings-acl.404
DOI:
10.18653/v1/2023.findings-acl.404
Bibkey:
Cite (ACL):
Jan Deriu, Pius von Däniken, Don Tuggener, and Mark Cieliebak. 2023. Correction of Errors in Preference Ratings from Automated Metrics for Text Generation. In Findings of the Association for Computational Linguistics: ACL 2023, pages 6456–6474, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Correction of Errors in Preference Ratings from Automated Metrics for Text Generation (Deriu et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-acl.404.pdf
Video:
https://aclanthology.org/2023.findings-acl.404.mp4