Improved Training of Deep Text Clustering

Zonghao Yang, Wenpeng Hu, Yushan Tan, Zhunchen Luo


Abstract
Classical deep clustering optimization methods typically leverage information such as cluster centers, mutual information, and distance metrics to construct implicit generalized labels, which provide information feedback (weak supervision) for optimizing the deep model. However, because clustering accuracy is limited, these generalized labels carry varying degrees of error throughout the clustering process, which severely interferes with training. To address this, this paper proposes a general deep clustering optimization method from the perspective of empirical risk minimization that exploits correlations between samples. Experiments on two classical deep clustering methods demonstrate the necessity and effectiveness of the method. Code is available at https://github.com/yangzonghao1024/DCGLU.
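The abstract leaves the mechanism at a high level. As a rough illustration only, and not the authors' DCGLU implementation, the PyTorch sketch below shows the two ingredients it names: implicit generalized labels obtained by assigning samples to their nearest cluster center, and an empirical risk defined over pairwise sample correlations rather than over individual labels. The function names and the margin value are hypothetical.

```python
import torch
import torch.nn.functional as F

def pseudo_labels_from_centers(embeddings: torch.Tensor,
                               centers: torch.Tensor) -> torch.Tensor:
    """Assign each sample to its nearest cluster center; these assignments
    act as the implicit 'generalized labels' (weak supervision)."""
    dists = torch.cdist(embeddings, centers)  # (N, K) distances to centers
    return dists.argmin(dim=1)                # (N,) pseudo-label per sample

def pairwise_risk(embeddings: torch.Tensor,
                  pseudo: torch.Tensor,
                  margin: float = 1.0) -> torch.Tensor:
    """Contrastive-style empirical risk over sample pairs: pull together
    pairs that share a pseudo-label, push apart pairs that do not."""
    dists = torch.cdist(embeddings, embeddings)               # (N, N)
    same = (pseudo.unsqueeze(0) == pseudo.unsqueeze(1)).float()
    pos = same * dists.pow(2)                                 # attract same-label pairs
    neg = (1.0 - same) * F.relu(margin - dists).pow(2)        # repel different-label pairs
    n = embeddings.size(0)
    off_diag = 1.0 - torch.eye(n, device=embeddings.device)   # ignore self-pairs
    return ((pos + neg) * off_diag).sum() / off_diag.sum()

# Toy usage: embed 8 samples, assign pseudo-labels from 3 centers, take a step.
emb = torch.randn(8, 16, requires_grad=True)
centers = torch.randn(3, 16)
labels = pseudo_labels_from_centers(emb.detach(), centers)
loss = pairwise_risk(emb, labels)
loss.backward()
```

A pairwise formulation is one standard way to soften per-sample label noise: a mislabeled sample corrupts only the pairs it participates in, rather than anchoring an entire class.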
Anthology ID: 2023.findings-emnlp.163
Volume: Findings of the Association for Computational Linguistics: EMNLP 2023
Month: December
Year: 2023
Address: Singapore
Editors: Houda Bouamor, Juan Pino, Kalika Bali
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 2490–2499
URL: https://aclanthology.org/2023.findings-emnlp.163
DOI: 10.18653/v1/2023.findings-emnlp.163
Cite (ACL): Zonghao Yang, Wenpeng Hu, Yushan Tan, and Zhunchen Luo. 2023. Improved Training of Deep Text Clustering. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 2490–2499, Singapore. Association for Computational Linguistics.
Cite (Informal): Improved Training of Deep Text Clustering (Yang et al., Findings 2023)
PDF: https://aclanthology.org/2023.findings-emnlp.163.pdf