Seed-Guided Topic Discovery with Out-of-Vocabulary Seeds

Yu Zhang, Yu Meng, Xuan Wang, Sheng Wang, Jiawei Han


Abstract
Discovering latent topics from text corpora has been studied for decades. Many existing topic models adopt a fully unsupervised setting, and the topics they discover may not cater to users’ particular interests due to their inability to leverage user guidance. Although there exist seed-guided topic discovery approaches that leverage user-provided seeds to discover topic-representative terms, they overlook two factors: (1) the existence of out-of-vocabulary seeds and (2) the power of pre-trained language models (PLMs). In this paper, we generalize the task of seed-guided topic discovery to allow out-of-vocabulary seeds. We propose a novel framework, named SeeTopic, in which the general knowledge of PLMs and the local semantics learned from the input corpus mutually benefit each other. Experiments on three real datasets from different domains demonstrate the effectiveness of SeeTopic in terms of topic coherence, accuracy, and diversity.
Anthology ID:
2022.naacl-main.21
Volume:
Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
Month:
July
Year:
2022
Address:
Seattle, United States
Editors:
Marine Carpuat, Marie-Catherine de Marneffe, Ivan Vladimir Meza Ruiz
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
279–290
URL:
https://aclanthology.org/2022.naacl-main.21
DOI:
10.18653/v1/2022.naacl-main.21
Cite (ACL):
Yu Zhang, Yu Meng, Xuan Wang, Sheng Wang, and Jiawei Han. 2022. Seed-Guided Topic Discovery with Out-of-Vocabulary Seeds. In Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pages 279–290, Seattle, United States. Association for Computational Linguistics.
Cite (Informal):
Seed-Guided Topic Discovery with Out-of-Vocabulary Seeds (Zhang et al., NAACL 2022)
PDF:
https://aclanthology.org/2022.naacl-main.21.pdf
Video:
https://aclanthology.org/2022.naacl-main.21.mp4
Code:
yuzhimanhua/seetopic
Data:
SciDocs