Topic Balancing with Additive Regularization of Topic Models

Eugeniia Veselova, Konstantin Vorontsov


Abstract
This article proposes a new approach to building topic models on unbalanced collections, based on existing methods and our experiments with them. Real-world data collections contain topics in varying proportions, and documents of a relatively small topic often become scattered across the larger topics instead of being grouped into one topic. To address this issue, we design a new regularizer for the Theta and Phi matrices of the probabilistic Latent Semantic Analysis (pLSA) model. We verify that this regularizer improves the quality of topic models trained on unbalanced collections, and we support it conceptually with our experiments.
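The abstract refers to the ARTM framework, in which regularizers enter the M-step of pLSA's EM algorithm additively. As a rough illustration of that general mechanism (not the balancing regularizer proposed in this paper), the sketch below runs pLSA EM in NumPy with a simple additive smoothing/sparsing term `tau` on the expected counts; all names and the regularizer choice are illustrative assumptions.

```python
import numpy as np

def artm_em(ndw, T, tau=0.0, iters=20, seed=0):
    """Minimal pLSA EM with an additive regularizer in the M-step
    (ARTM-style update: counts + tau, clipped at zero, then renormalized).
    tau > 0 smooths the distributions; tau < 0 sparses them.
    Illustrative sketch only, not the paper's balancing regularizer."""
    rng = np.random.default_rng(seed)
    W, D = ndw.shape
    phi = rng.random((W, T)); phi /= phi.sum(0, keepdims=True)      # p(w|t)
    theta = rng.random((T, D)); theta /= theta.sum(0, keepdims=True)  # p(t|d)
    for _ in range(iters):
        # E-step: model probabilities p(w|d) and count ratios n_dw / p(w|d)
        pwd = np.maximum(phi @ theta, 1e-12)   # (W, D)
        ratio = ndw / pwd                      # (W, D)
        # Expected counts n_wt and n_td
        nwt = phi * (ratio @ theta.T)          # (W, T)
        ntd = theta * (phi.T @ ratio)          # (T, D)
        # M-step with additive regularization
        phi = np.maximum(nwt + tau, 0)
        phi /= phi.sum(0, keepdims=True)
        theta = np.maximum(ntd + tau, 0)
        theta /= theta.sum(0, keepdims=True)
    return phi, theta
```

With `tau = 0` this reduces to plain pLSA; the paper's contribution is a different, topic-balancing choice of regularizer plugged into the same additive M-step.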
Anthology ID:
2020.acl-srw.9
Volume:
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics: Student Research Workshop
Month:
July
Year:
2020
Address:
Online
Editors:
Shruti Rijhwani, Jiangming Liu, Yizhong Wang, Rotem Dror
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
59–65
URL:
https://aclanthology.org/2020.acl-srw.9
DOI:
10.18653/v1/2020.acl-srw.9
Cite (ACL):
Eugeniia Veselova and Konstantin Vorontsov. 2020. Topic Balancing with Additive Regularization of Topic Models. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics: Student Research Workshop, pages 59–65, Online. Association for Computational Linguistics.
Cite (Informal):
Topic Balancing with Additive Regularization of Topic Models (Veselova & Vorontsov, ACL 2020)
PDF:
https://aclanthology.org/2020.acl-srw.9.pdf
Video:
http://slideslive.com/38928633