Scale-Invariant Infinite Hierarchical Topic Model

Shusei Eshima, Daichi Mochihashi


Abstract
Hierarchical topic models have been employed to organize a large number of diverse topics from corpora into a latent tree structure. However, existing models yield fragmented topics with overlapping themes whose expected probability becomes exponentially smaller along the depth of the tree. To solve this intrinsic problem, we propose a scale-invariant infinite hierarchical topic model (ihLDA). The ihLDA adaptively adjusts the topic creation to make the expected topic probability decay considerably slower than that in existing models. Thus, it facilitates the estimation of deeper topic structures encompassing diverse topics in a corpus. Furthermore, the ihLDA extends a widely used tree-structured prior (Adams et al., 2010) in a hierarchical Bayesian way, which enables drawing an infinite topic tree from the base tree while efficiently sampling the topic assignments for the words. Experiments demonstrate that the ihLDA has better topic uniqueness and hierarchical diversity than existing approaches, including state-of-the-art neural models.
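To illustrate the decay the abstract refers to, the following is a minimal sketch (not the authors' code) of the expected topic probability along one root-to-leaf path under a tree-structured stick-breaking prior in the style of Adams et al. (2010). The hyperparameter names alpha (stopping) and gamma (branching) and the chosen values are assumptions for illustration only; since the Beta draws at different nodes are independent, the expected node mass is the product of per-level expectations and shrinks geometrically with depth.

```python
def expected_path_probabilities(alpha=1.0, gamma=1.0, max_depth=6):
    """Expected mass of the first-child node at each depth (illustrative only).

    E[nu]  = 1 / (1 + alpha)  is the expected probability of stopping at a node;
    E[psi] = 1 / (1 + gamma)  is the expected share taken by the first child.
    """
    e_stop = 1.0 / (1.0 + alpha)    # expected stopping probability at a node
    e_pass = 1.0 - e_stop           # expected probability of descending further
    e_branch = 1.0 / (1.0 + gamma)  # expected share of the first child branch

    probs = []
    mass = 1.0
    for _ in range(max_depth + 1):
        probs.append(mass * e_stop)  # expected mass assigned to this node
        mass *= e_pass * e_branch    # expected mass flowing into the first child
    return probs


if __name__ == "__main__":
    for depth, prob in enumerate(expected_path_probabilities()):
        print(f"depth {depth}: expected topic probability ~ {prob:.4f}")
```

Under these assumed hyperparameters the expected probability is halved and quartered at each successive level, which is the exponential decay that the ihLDA is designed to slow down.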
Anthology ID:
2023.findings-acl.745
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
11731–11746
URL:
https://aclanthology.org/2023.findings-acl.745
DOI:
10.18653/v1/2023.findings-acl.745
Cite (ACL):
Shusei Eshima and Daichi Mochihashi. 2023. Scale-Invariant Infinite Hierarchical Topic Model. In Findings of the Association for Computational Linguistics: ACL 2023, pages 11731–11746, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Scale-Invariant Infinite Hierarchical Topic Model (Eshima & Mochihashi, Findings 2023)
PDF:
https://aclanthology.org/2023.findings-acl.745.pdf