An Effective Deployment of Contrastive Learning in Multi-label Text Classification

Nankai Lin, Guanqiu Qin, Gang Wang, Dong Zhou, Aimin Yang


Abstract
The effectiveness of contrastive learning in natural language processing tasks remains underexplored. The core challenge of contrastive learning is constructing positive and negative samples correctly and reasonably, and identifying suitable contrastive objects is even harder in multi-label text classification, for which very few contrastive losses have been proposed. In this paper, we approach the problem from a different angle by proposing five novel contrastive losses for multi-label text classification: Strict Contrastive Loss (SCL), Intra-label Contrastive Loss (ICL), Jaccard Similarity Contrastive Loss (JSCL), Jaccard Similarity Probability Contrastive Loss (JSPCL), and Stepwise Label Contrastive Loss (SLCL). Using these losses, we explore the effectiveness of contrastive learning for multi-label text classification and provide a set of baseline models for deploying contrastive learning techniques on specific tasks. We further perform an interpretable analysis of our approach to show how the different components of the contrastive losses play their roles. Experimental results show that the proposed losses improve multi-label text classification, and our work also sheds light on how contrastive learning should be adapted to this setting.
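To give a flavour of what a label-overlap-based contrastive objective can look like, the PyTorch-style sketch below weights sample pairs by the Jaccard similarity of their label sets. This is only an illustrative assumption in the spirit of the Jaccard-based losses named in the abstract, not the paper's actual formulation; the function name, temperature value, and tensor shapes are hypothetical.

```python
import torch
import torch.nn.functional as F

def jaccard_contrastive_loss(embeddings, labels, temperature=0.1):
    """Illustrative multi-label contrastive loss (hypothetical, not the
    paper's exact JSCL): pairs are weighted by the Jaccard similarity of
    their label sets.

    embeddings: (N, D) float tensor of text representations
    labels:     (N, L) multi-hot tensor of label assignments
    """
    z = F.normalize(embeddings, dim=1)            # unit-norm representations
    sim = z @ z.t() / temperature                 # scaled cosine similarities

    labels = labels.float()
    inter = labels @ labels.t()                   # |A ∩ B| for every pair
    union = labels.sum(1, keepdim=True) + labels.sum(1) - inter
    jaccard = inter / union.clamp(min=1e-8)       # pairwise Jaccard weights

    # exclude self-comparisons from both weights and the partition function
    n = z.size(0)
    eye = torch.eye(n, dtype=torch.bool, device=z.device)
    jaccard = jaccard.masked_fill(eye, 0.0)
    log_prob = sim - torch.logsumexp(
        sim.masked_fill(eye, float("-inf")), dim=1, keepdim=True
    )

    # weighted average log-likelihood of label-overlapping "positives"
    weight_sum = jaccard.sum(1).clamp(min=1e-8)
    loss = -(jaccard * log_prob).sum(1) / weight_sum
    return loss.mean()
```

Under this assumed design, samples that share no labels receive zero weight and act purely as negatives through the softmax denominator, while pairs with larger label overlap are pulled together more strongly.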
Anthology ID: 2023.findings-acl.556
Volume: Findings of the Association for Computational Linguistics: ACL 2023
Month: July
Year: 2023
Address: Toronto, Canada
Editors: Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 8730–8744
URL: https://aclanthology.org/2023.findings-acl.556
DOI: 10.18653/v1/2023.findings-acl.556
Cite (ACL): Nankai Lin, Guanqiu Qin, Gang Wang, Dong Zhou, and Aimin Yang. 2023. An Effective Deployment of Contrastive Learning in Multi-label Text Classification. In Findings of the Association for Computational Linguistics: ACL 2023, pages 8730–8744, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal): An Effective Deployment of Contrastive Learning in Multi-label Text Classification (Lin et al., Findings 2023)
PDF: https://aclanthology.org/2023.findings-acl.556.pdf