%0 Conference Proceedings
%T Modeling Text using the Continuous Space Topic Model with Pre-Trained Word Embeddings
%A Inoue, Seiichi
%A Aida, Taichi
%A Komachi, Mamoru
%A Asai, Manabu
%Y Kabbara, Jad
%Y Lin, Haitao
%Y Paullada, Amandalynne
%Y Vamvas, Jannis
%S Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing: Student Research Workshop
%D 2021
%8 August
%I Association for Computational Linguistics
%C Online
%F inoue-etal-2021-modeling
%X In this study, we propose a model that extends the continuous space topic model (CSTM), which flexibly controls word probability in a document, using pre-trained word embeddings. To develop the proposed model, we pre-train word embeddings that capture the semantics of words, and plug them into the CSTM. Intrinsic experimental results show that the proposed model outperforms the CSTM in terms of perplexity and convergence speed. Furthermore, extrinsic experimental results show that the proposed model is more useful for a document classification task than the baseline model. We qualitatively show that the latent coordinates obtained by training the proposed model are better than those of the baseline model.
%R 10.18653/v1/2021.acl-srw.15
%U https://aclanthology.org/2021.acl-srw.15
%U https://doi.org/10.18653/v1/2021.acl-srw.15
%P 138-147