KPEval: Towards Fine-Grained Semantic-Based Keyphrase Evaluation

Di Wu, Da Yin, Kai-Wei Chang


Abstract
Despite significant advancements in keyphrase extraction and keyphrase generation methods, the predominant evaluation approach still relies on exact matching with human references. This scheme fails to recognize systems that generate keyphrases semantically equivalent to the references, or diverse keyphrases that carry practical utility. To better assess the capability of keyphrase systems, we propose KPEval, a comprehensive evaluation framework covering four critical aspects: reference agreement, faithfulness, diversity, and utility. For each aspect, we design semantic-based metrics that reflect the evaluation objectives. Meta-evaluation studies demonstrate that our evaluation strategy correlates better with human preferences than a range of previously proposed metrics. Using KPEval, we re-evaluate 23 keyphrase systems and discover that (1) established model comparison results have blind spots, especially under reference-free evaluation; (2) large language models are underestimated by prior evaluation work; and (3) no single model excels in all aspects.
Anthology ID:
2024.findings-acl.117
Volume:
Findings of the Association for Computational Linguistics ACL 2024
Month:
August
Year:
2024
Address:
Bangkok, Thailand and virtual meeting
Editors:
Lun-Wei Ku, Andre Martins, Vivek Srikumar
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
1959–1981
URL:
https://aclanthology.org/2024.findings-acl.117
Cite (ACL):
Di Wu, Da Yin, and Kai-Wei Chang. 2024. KPEval: Towards Fine-Grained Semantic-Based Keyphrase Evaluation. In Findings of the Association for Computational Linguistics ACL 2024, pages 1959–1981, Bangkok, Thailand and virtual meeting. Association for Computational Linguistics.
Cite (Informal):
KPEval: Towards Fine-Grained Semantic-Based Keyphrase Evaluation (Wu et al., Findings 2024)
PDF:
https://aclanthology.org/2024.findings-acl.117.pdf