Unsupervised Open-domain Keyphrase Generation

Lam Do, Pritom Saha Akash, Kevin Chen-Chuan Chang


Abstract
In this work, we study the problem of unsupervised open-domain keyphrase generation, where the goal is to build a keyphrase generation model without human-labeled data that performs consistently across domains. To solve this problem, we propose a seq2seq model consisting of two modules, a phraseness module and an informativeness module, both of which can be built in an unsupervised and open-domain fashion. The phraseness module generates phrases, while the informativeness module guides the generation towards those that represent the core concepts of the text. We thoroughly evaluate our proposed method on eight benchmark datasets from different domains. Results on in-domain datasets show that our approach achieves state-of-the-art results compared with existing unsupervised models, and overall narrows the gap between supervised and unsupervised methods to about 16%. Furthermore, we demonstrate that our model performs consistently across domains, as it surpasses the baselines on out-of-domain datasets.
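The abstract frames keyphrase generation as the interplay of two signals: phraseness (is a span a well-formed phrase?) and informativeness (does it capture the document's core concepts?). As a rough illustration of that idea only, and not the paper's actual seq2seq architecture, the sketch below ranks candidate phrases by mixing a toy phraseness score (pointwise mutual information of adjacent words over a reference corpus) with a toy informativeness score (in-document word frequency). All function names, scoring choices, and the weighting parameter alpha are illustrative assumptions.

from collections import Counter
import math
import re

def phraseness_score(candidate, bigrams, unigrams):
    """Toy phraseness signal: average pointwise mutual information of
    adjacent word pairs, so common collocations outrank random pairs."""
    words = candidate.split()
    if len(words) < 2:
        return 0.0
    total = sum(unigrams.values()) or 1
    score = 0.0
    for w1, w2 in zip(words, words[1:]):
        p1 = unigrams[w1] / total
        p2 = unigrams[w2] / total
        p12 = bigrams[(w1, w2)] / total
        if p1 > 0 and p2 > 0 and p12 > 0:
            score += math.log(p12 / (p1 * p2))
    return score / (len(words) - 1)

def informativeness_score(candidate, doc_counts):
    """Toy informativeness signal: mean in-document frequency of the
    candidate's words, favoring phrases about the document's main topic."""
    words = candidate.split()
    return sum(doc_counts[w] for w in words) / len(words)

def rank_keyphrases(candidates, document, corpus, alpha=0.5):
    """Rank candidate phrases by a weighted mix of the two signals."""
    tokenize = lambda s: re.findall(r"\w+", s.lower())
    corpus_tokens = tokenize(corpus)
    unigrams = Counter(corpus_tokens)
    bigrams = Counter(zip(corpus_tokens, corpus_tokens[1:]))
    doc_counts = Counter(tokenize(document))
    scored = [
        (alpha * phraseness_score(c.lower(), bigrams, unigrams)
         + (1 - alpha) * informativeness_score(c.lower(), doc_counts), c)
        for c in candidates
    ]
    return [c for _, c in sorted(scored, reverse=True)]

if __name__ == "__main__":
    doc = "Keyphrase generation produces phrases that summarize a document."
    corpus = doc  # stand-in reference corpus for this toy example
    print(rank_keyphrases(["keyphrase generation", "a that"], doc, corpus))

In this toy setup, "keyphrase generation" outranks "a that" because it is both a frequent collocation and topically central; the paper's generative model plays an analogous balancing role during decoding.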
Anthology ID: 2023.acl-long.592
Volume: Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month: July
Year: 2023
Address: Toronto, Canada
Editors: Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue: ACL
Publisher: Association for Computational Linguistics
Pages: 10614–10627
URL: https://aclanthology.org/2023.acl-long.592
DOI: 10.18653/v1/2023.acl-long.592
Cite (ACL): Lam Do, Pritom Saha Akash, and Kevin Chen-Chuan Chang. 2023. Unsupervised Open-domain Keyphrase Generation. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 10614–10627, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal): Unsupervised Open-domain Keyphrase Generation (Do et al., ACL 2023)
PDF: https://aclanthology.org/2023.acl-long.592.pdf
Video: https://aclanthology.org/2023.acl-long.592.mp4