SPACL: Shared-Private Architecture based on Contrastive Learning for Multi-domain Text Classification

Xiong Guoding, Zhou Yongmei, Wang Deheng, Ouyang Zhouhao


Abstract
With the development of deep learning in recent years, text classification research has achieved remarkable results. However, text classification tasks often require large amounts of annotated data, and data from different domains often force the model to learn different knowledge. It is often difficult for models to distinguish labeled data drawn from different domains; data from other domains can even impair the model's classification ability and reduce its overall performance. To address these issues, we propose a shared-private architecture based on contrastive learning for multi-domain text classification, which improves both the accuracy and the robustness of classifiers. Extensive experiments on two public datasets show that our approach achieves state-of-the-art performance in multi-domain text classification.
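The abstract names two components: a shared-private encoder split (a shared encoder learns domain-invariant features while per-domain private encoders keep domain-specific ones) and a contrastive objective over representations. The sketch below is a minimal, illustrative PyTorch rendition of that general idea, not the authors' implementation; the encoder choice (GRU), the supervised NT-Xent-style loss over the shared view, and all module names and hyperparameters are assumptions for exposition.

import torch
import torch.nn as nn
import torch.nn.functional as F

class SharedPrivateSketch(nn.Module):
    """Illustrative shared-private model: one shared encoder plus one
    private encoder per domain; the classifier sees both views."""
    def __init__(self, vocab_size, emb_dim, hid_dim, num_domains, num_classes):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.shared = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.private = nn.ModuleList(
            [nn.GRU(emb_dim, hid_dim, batch_first=True) for _ in range(num_domains)]
        )
        self.classifier = nn.Linear(2 * hid_dim, num_classes)

    def forward(self, token_ids, domain_id):
        x = self.embed(token_ids)                    # (B, T, E)
        _, h_shared = self.shared(x)                 # (1, B, H)
        _, h_private = self.private[domain_id](x)    # (1, B, H)
        h_s, h_p = h_shared[-1], h_private[-1]
        logits = self.classifier(torch.cat([h_s, h_p], dim=-1))
        return logits, h_s    # shared view feeds the contrastive loss

def supervised_contrastive_loss(features, labels, temperature=0.1):
    """NT-Xent-style loss: pull together shared representations of
    same-label examples, push apart different-label ones."""
    z = F.normalize(features, dim=-1)
    sim = z @ z.t() / temperature                        # (B, B) similarities
    pos_mask = labels.unsqueeze(0).eq(labels.unsqueeze(1))
    pos_mask.fill_diagonal_(False)                       # drop self-pairs
    # softmax over all other examples, averaged over positive pairs
    sim = sim - torch.eye(len(z), device=z.device) * 1e9  # exclude self
    log_prob = F.log_softmax(sim, dim=-1)
    pos_counts = pos_mask.sum(-1).clamp(min=1)
    return -(log_prob * pos_mask).sum(-1).div(pos_counts).mean()

Training would then combine the usual cross-entropy with the contrastive term (the 0.5 weight below is an arbitrary placeholder, not a value from the paper):

# logits, h_shared = model(token_ids, domain_id)
# loss = F.cross_entropy(logits, labels) \
#        + 0.5 * supervised_contrastive_loss(h_shared, labels)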
Anthology ID:
2022.ccl-1.84
Volume:
Proceedings of the 21st Chinese National Conference on Computational Linguistics
Month:
October
Year:
2022
Address:
Nanchang, China
Editors:
Maosong Sun (孙茂松), Yang Liu (刘洋), Wanxiang Che (车万翔), Yang Feng (冯洋), Xipeng Qiu (邱锡鹏), Gaoqi Rao (饶高琦), Yubo Chen (陈玉博)
Venue:
CCL
Publisher:
Chinese Information Processing Society of China
Pages:
958–965
Language:
English
URL:
https://aclanthology.org/2022.ccl-1.84
Cite (ACL):
Xiong Guoding, Zhou Yongmei, Wang Deheng, and Ouyang Zhouhao. 2022. SPACL: Shared-Private Architecture based on Contrastive Learning for Multi-domain Text Classification. In Proceedings of the 21st Chinese National Conference on Computational Linguistics, pages 958–965, Nanchang, China. Chinese Information Processing Society of China.
Cite (Informal):
SPACL: Shared-Private Architecture based on Contrastive Learning for Multi-domain Text Classification (Guoding et al., CCL 2022)
PDF:
https://aclanthology.org/2022.ccl-1.84.pdf