Attention-Seeker: Dynamic Self-Attention Scoring for Unsupervised Keyphrase Extraction

Erwin Daniel Lopez Zapata, Cheng Tang, Atsushi Shimada


Abstract
This paper proposes Attention-Seeker, an unsupervised keyphrase extraction method that leverages self-attention maps from a Large Language Model to estimate the importance of candidate phrases. Our approach identifies specific components – such as layers, heads, and attention vectors – where the model pays significant attention to the key topics of the text. The attention weights provided by these components are then used to score the candidate phrases. Unlike previous models that require manual tuning of parameters (e.g., selection of heads, prompts, hyperparameters), Attention-Seeker dynamically adapts to the input text without any manual adjustments, enhancing its practical applicability. We evaluate Attention-Seeker on four publicly available datasets: Inspec, SemEval2010, SemEval2017, and Krapivin. Our results demonstrate that, even without parameter tuning, Attention-Seeker outperforms most baseline models, achieving state-of-the-art performance on three out of four datasets, particularly excelling in extracting keyphrases from long documents.
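The core idea in the abstract, scoring candidate phrases by the self-attention weight they receive, can be illustrated with a minimal sketch. This is not the authors' algorithm (the paper's dynamic selection of layers, heads, and attention vectors is more involved); here head weighting is approximated by a simple entropy heuristic (more concentrated heads weigh more), and the attention tensor, `head_weights`, and `score_candidates` are all illustrative assumptions.

```python
import numpy as np

def head_weights(attn):
    """Weight each head by how concentrated its attention is.

    attn: (num_heads, seq_len, seq_len) row-stochastic self-attention maps.
    Low-entropy heads (attention focused on few tokens) get higher weight;
    this is a simple proxy, not the paper's selection procedure.
    """
    # Per-head distribution of attention received by each token
    # (mean over query positions; each row sums to 1, so columns' mean does too).
    p = attn.mean(axis=1)                              # (num_heads, seq_len)
    entropy = -(p * np.log(p + 1e-12)).sum(axis=1)     # (num_heads,)
    w = np.exp(-entropy)
    return w / w.sum()

def score_candidates(attn, candidates):
    """Score candidate phrases by the attention their tokens receive.

    candidates: dict mapping phrase name -> list of token indices.
    Returns a dict of phrase scores (mean weighted attention received).
    """
    w = head_weights(attn)
    p = attn.mean(axis=1)                              # (num_heads, seq_len)
    received = (w[:, None] * p).sum(axis=0)            # (seq_len,)
    return {name: float(received[idx].mean())
            for name, idx in candidates.items()}

# Toy example: two heads, four tokens, every query attends mostly to token 2.
attn = np.full((2, 4, 4), 0.1)
attn[:, :, 2] = 0.7
scores = score_candidates(attn, {"phrase_a": [2], "phrase_b": [0, 1]})
```

In this toy setup, the phrase containing the heavily attended token 2 receives the higher score, mirroring the intuition that tokens the model attends to mark key content.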
Anthology ID:
2025.coling-main.335
Volume:
Proceedings of the 31st International Conference on Computational Linguistics
Month:
January
Year:
2025
Address:
Abu Dhabi, UAE
Editors:
Owen Rambow, Leo Wanner, Marianna Apidianaki, Hend Al-Khalifa, Barbara Di Eugenio, Steven Schockaert
Venue:
COLING
Publisher:
Association for Computational Linguistics
Pages:
5011–5026
URL:
https://aclanthology.org/2025.coling-main.335/
Cite (ACL):
Erwin Daniel Lopez Zapata, Cheng Tang, and Atsushi Shimada. 2025. Attention-Seeker: Dynamic Self-Attention Scoring for Unsupervised Keyphrase Extraction. In Proceedings of the 31st International Conference on Computational Linguistics, pages 5011–5026, Abu Dhabi, UAE. Association for Computational Linguistics.
Cite (Informal):
Attention-Seeker: Dynamic Self-Attention Scoring for Unsupervised Keyphrase Extraction (Lopez Zapata et al., COLING 2025)
PDF:
https://aclanthology.org/2025.coling-main.335.pdf