Re-Examining Summarization Evaluation across Multiple Quality Criteria

Ori Ernst, Ori Shapira, Ido Dagan, Ran Levy


Abstract
The common practice for assessing automatic evaluation metrics is to measure the correlation between their induced system rankings and those obtained by reliable human evaluation, where a higher correlation indicates a better metric. Yet, an intricate setting arises when an NLP task is evaluated by multiple Quality Criteria (QCs), as in text summarization, where prominent criteria include relevance, consistency, fluency, and coherence. In this paper, we challenge the soundness of this methodology when multiple QCs are involved, concretely for the summarization case. First, we show that the allegedly best metrics for certain QCs actually do not perform well, failing to detect even drastic summary corruptions with respect to the considered QC. To explain this, we show that some of the high correlations obtained in the multi-QC setup are spurious. Finally, we propose a procedure that may help detect this effect. Overall, our findings highlight the need for further investigating metric evaluation methodologies for the multiple-QC case.
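The meta-evaluation setup described above can be sketched as follows: rank summarization systems by an automatic metric and by human judgments for a given quality criterion, then compute a rank correlation (Kendall's tau here) between the two. This is a minimal illustrative sketch; all scores below are invented for demonstration and are not data from the paper.

```python
def kendall_tau(xs, ys):
    """Kendall rank correlation between two equal-length score lists.

    Counts concordant vs. discordant system pairs: a pair is concordant
    when both score lists order the two systems the same way.
    """
    assert len(xs) == len(ys)
    n = len(xs)
    concordant = discordant = 0
    for i in range(n):
        for j in range(i + 1, n):
            prod = (xs[i] - xs[j]) * (ys[i] - ys[j])
            if prod > 0:
                concordant += 1
            elif prod < 0:
                discordant += 1
    return (concordant - discordant) / (n * (n - 1) / 2)

# Hypothetical per-system scores for one quality criterion (e.g. coherence):
human_scores = [4.1, 3.6, 3.9, 2.8, 3.2]      # mean human ratings per system
metric_scores = [0.62, 0.55, 0.60, 0.40, 0.57]  # automatic metric scores

tau = kendall_tau(human_scores, metric_scores)
print(f"Kendall tau = {tau:.2f}")  # prints "Kendall tau = 0.80"
```

Under the standard methodology, a higher tau would be taken as evidence that the metric is better for that criterion; the paper's point is that in the multi-QC setting such correlations can be spuriously high.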
Anthology ID:
2023.findings-emnlp.924
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2023
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
13829–13838
URL:
https://aclanthology.org/2023.findings-emnlp.924
DOI:
10.18653/v1/2023.findings-emnlp.924
Cite (ACL):
Ori Ernst, Ori Shapira, Ido Dagan, and Ran Levy. 2023. Re-Examining Summarization Evaluation across Multiple Quality Criteria. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 13829–13838, Singapore. Association for Computational Linguistics.
Cite (Informal):
Re-Examining Summarization Evaluation across Multiple Quality Criteria (Ernst et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-emnlp.924.pdf
Video:
https://aclanthology.org/2023.findings-emnlp.924.mp4