Dinh Viet Sang
2025
Multi-Surrogate-Objective Optimization for Neural Topic Models
Tue Le | Hoang Tran Vuong | Tung Nguyen | Linh Ngo Van | Dinh Viet Sang | Trung Le | Thien Huu Nguyen
Findings of the Association for Computational Linguistics: EMNLP 2025
Neural topic modeling has substantially improved topic quality and document-topic distributions compared to traditional probabilistic methods. These models often incorporate multiple loss functions, but the disparate magnitudes of these losses can make hyperparameter tuning challenging and can hinder simultaneous optimization. While gradient-based Multi-objective Optimization (MOO) algorithms offer a potential solution, they are typically applied to shared parameters in multi-task learning, hindering their broader adoption, particularly in Neural Topic Models (NTMs). Furthermore, our experiments reveal that naïve MOO applications on NTMs can yield suboptimal results, even underperforming implementations without the MOO mechanism. This paper proposes a novel approach that integrates MOO algorithms independently of hard-parameter-sharing architectures and effectively optimizes multiple NTM loss functions. Comprehensive evaluations on widely used benchmark datasets demonstrate that our approach significantly enhances baseline topic model performance and outperforms direct MOO applications on NTMs.
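The abstract refers to gradient-based MOO algorithms that combine conflicting loss gradients. As background (this is not the paper's proposed method), a minimal sketch of the standard two-task MGDA-style update: find the min-norm convex combination of the two loss gradients, which has a closed form for two objectives. All names here are illustrative.

```python
import numpy as np

def mgda_two_task(g1, g2):
    """Min-norm convex combination of two task gradients (classic two-task MGDA).

    Returns (alpha, g) with g = alpha*g1 + (1-alpha)*g2 minimizing ||g|| over
    alpha in [0, 1]; stepping along -g decreases both losses, or g is zero at
    a Pareto-stationary point.
    """
    diff = g1 - g2
    denom = diff @ diff
    if denom == 0.0:  # identical gradients: any combination is equivalent
        return 0.5, g1.copy()
    alpha = float(np.clip((g2 - g1) @ g2 / denom, 0.0, 1.0))
    return alpha, alpha * g1 + (1.0 - alpha) * g2

# Illustrative conflicting gradients, e.g. a reconstruction term vs. a regularizer
g_rec = np.array([1.0, 0.0])
g_reg = np.array([0.0, 1.0])
alpha, g = mgda_two_task(g_rec, g_reg)  # alpha = 0.5, g = [0.5, 0.5]
```

Because the combination is scale-sensitive, losses of very different magnitudes can dominate the min-norm solution, which is one reason naïvely applying MOO to NTM losses can underperform.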
XTRA: Cross-Lingual Topic Modeling with Topic and Representation Alignments
Tien Phat Nguyen | Ngo Vu Minh | Tung Nguyen | Linh Ngo Van | Duc Anh Nguyen | Dinh Viet Sang | Trung Le
Findings of the Association for Computational Linguistics: EMNLP 2025
Cross-lingual topic modeling aims to uncover shared semantic themes across languages. Several methods have been proposed to address this problem, leveraging both traditional and neural approaches. While previous methods have achieved some improvements in topic diversity, they often struggle to ensure high topic coherence and consistent alignment across languages. We propose XTRA (Cross-Lingual Topic Modeling with Topic and Representation Alignments), a novel framework that unifies Bag-of-Words modeling with multilingual embeddings. XTRA introduces two core components: (1) representation alignment, aligning document-topic distributions via contrastive learning in a shared semantic space; and (2) topic alignment, projecting topic-word distributions into the same space to enforce cross-lingual consistency. This dual mechanism enables XTRA to learn topics that are interpretable (coherent and diverse) and well-aligned across languages. Experiments on multilingual corpora confirm that XTRA significantly outperforms strong baselines in topic coherence, diversity, and alignment quality.
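The representation-alignment component described above uses contrastive learning over paired cross-lingual documents in a shared space. The abstract does not give XTRA's exact objective; a generic symmetric InfoNCE sketch, assuming each source-language document `i` is paired with its target-language counterpart `i` (all names illustrative):

```python
import numpy as np

def _xent_diag(logits):
    """Cross-entropy with the diagonal as targets, numerically stable."""
    m = logits.max(axis=1, keepdims=True)
    lse = m.squeeze(1) + np.log(np.exp(logits - m).sum(axis=1))
    return (lse - np.diag(logits)).mean()

def info_nce_alignment(z_src, z_tgt, temperature=0.1):
    """Symmetric InfoNCE over paired cross-lingual document representations.

    z_src[i] and z_tgt[i] are projections (e.g. of document-topic
    distributions) of the same document in two languages; row i's pair is
    the positive, all other rows in the batch serve as negatives.
    """
    z_src = z_src / np.linalg.norm(z_src, axis=1, keepdims=True)
    z_tgt = z_tgt / np.linalg.norm(z_tgt, axis=1, keepdims=True)
    logits = (z_src @ z_tgt.T) / temperature  # cosine similarities
    return 0.5 * (_xent_diag(logits) + _xent_diag(logits.T))
```

Minimizing this loss pulls each document toward its translation and away from other documents, encouraging the consistent cross-lingual alignment the abstract describes.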
Co-authors
- Trung Le 2
- Tung Nguyen 2
- Linh Ngo Van 2
- Tue Le 1
- Ngo Vu Minh 1