%0 Conference Proceedings
%T Self-Supervised Neural Topic Modeling
%A Bahrainian, Seyed Ali
%A Jaggi, Martin
%A Eickhoff, Carsten
%Y Moens, Marie-Francine
%Y Huang, Xuanjing
%Y Specia, Lucia
%Y Yih, Scott Wen-tau
%S Findings of the Association for Computational Linguistics: EMNLP 2021
%D 2021
%8 November
%I Association for Computational Linguistics
%C Punta Cana, Dominican Republic
%F bahrainian-etal-2021-self-supervised
%X Topic models are useful tools for analyzing and interpreting the main underlying themes of large text corpora. Most topic models rely on word co-occurrence to compute a topic, i.e., a weighted set of words that together represent a high-level semantic concept. In this paper, we propose a new lightweight Self-Supervised Neural Topic Model (SNTM) that learns rich context by jointly learning a topic representation from three co-occurring words and the document the triple originates from. Our experimental results indicate that SNTM outperforms existing topic models on coherence metrics as well as document clustering accuracy. Beyond topic coherence and clustering performance, the proposed model offers further advantages: it is computationally efficient and easy to train.
%R 10.18653/v1/2021.findings-emnlp.284
%U https://aclanthology.org/2021.findings-emnlp.284
%U https://doi.org/10.18653/v1/2021.findings-emnlp.284
%P 3341-3350