Neural Attention-Aware Hierarchical Topic Model

Yuan Jin, He Zhao, Ming Liu, Lan Du, Wray Buntine


Abstract
Neural topic models (NTMs) apply deep neural networks to topic modelling. Despite their success, NTMs generally ignore two important aspects: (1) only document-level word count information is utilized for the training, while more fine-grained sentence-level information is ignored, and (2) external semantic knowledge regarding documents, sentences and words are not exploited for the training. To address these issues, we propose a variational autoencoder (VAE) NTM model that jointly reconstructs the sentence and document word counts using combinations of bag-of-words (BoW) topical embeddings and pre-trained semantic embeddings. The pre-trained embeddings are first transformed into a common latent topical space to align their semantics with the BoW embeddings. Our model also features hierarchical KL divergence to leverage embeddings of each document to regularize those of their sentences, paying more attention to semantically relevant sentences. Both quantitative and qualitative experiments have shown the efficacy of our model in 1) lowering the reconstruction errors at both the sentence and document levels, and 2) discovering more coherent topics from real-world datasets.
Anthology ID:
2021.emnlp-main.80
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
1042–1052
URL:
https://aclanthology.org/2021.emnlp-main.80
DOI:
10.18653/v1/2021.emnlp-main.80
Cite (ACL):
Yuan Jin, He Zhao, Ming Liu, Lan Du, and Wray Buntine. 2021. Neural Attention-Aware Hierarchical Topic Model. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 1042–1052, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Neural Attention-Aware Hierarchical Topic Model (Jin et al., EMNLP 2021)
PDF:
https://aclanthology.org/2021.emnlp-main.80.pdf
Video:
https://aclanthology.org/2021.emnlp-main.80.mp4