Dual Contrastive Learning Framework for Incremental Text Classification

Yigong Wang, Zhuoyi Wang, Yu Lin, Jinghui Guo, Sadaf Halim, Latifur Khan


Abstract
Incremental learning plays a pivotal role in online knowledge discovery, as it enables large models (LMs) to learn and refresh knowledge continuously. Many approaches have been proposed to preserve knowledge from previous tasks while learning new concepts in online NLP applications. In this paper, we focus on learning a more generalized embedding space that transfers better to various downstream sequence tasks. The key idea is to learn from both task-agnostic and task-specific embedding aspects, so that catastrophic forgetting, the inherent challenge of incremental learning, can be addressed with a more generalized solution. We propose a dual contrastive learning (DCL) based framework to foster the transferability of representations across different tasks. It consists of two key components: first, we employ global contrastive learning, a task-agnostic strategy that promotes a generalized embedding space; second, since domain shift from unseen distributions can compromise the quality of learned embeddings, we further incorporate a task-specific attention mechanism that adapts task-specific weights to emerging tasks and ultimately reduces errors in the generic representations. Experiments on various text datasets demonstrate that our approach achieves superior performance and outperforms current state-of-the-art methods.
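The task-agnostic component builds on contrastive objectives computed over labeled batches of text embeddings. As an illustration only, and not the paper's exact formulation, the following minimal NumPy sketch shows a standard supervised contrastive (SupCon-style) loss of the kind such global contrastive learning typically uses: embeddings of same-class examples are pulled together and others pushed apart; the function name and temperature value are assumptions for this sketch.

```python
import numpy as np

def supervised_contrastive_loss(embeddings, labels, temperature=0.1):
    """SupCon-style loss over a batch (illustrative sketch).

    embeddings: (n, d) array of representations
    labels:     length-n array of class labels
    """
    z = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sim = z @ z.T / temperature          # pairwise cosine similarities, scaled
    n = len(labels)
    not_self = ~np.eye(n, dtype=bool)    # exclude each anchor from its own denominator

    # log-softmax of each anchor's similarity over all other samples
    sim_max = sim.max(axis=1, keepdims=True)
    exp_sim = np.exp(sim - sim_max) * not_self
    log_prob = sim - sim_max - np.log(exp_sim.sum(axis=1, keepdims=True))

    labels = np.asarray(labels)
    pos_mask = (labels[:, None] == labels[None, :]) & not_self
    pos_counts = pos_mask.sum(axis=1)

    # mean log-probability over each anchor's positives, averaged over anchors
    loss = -(log_prob * pos_mask).sum(axis=1) / np.maximum(pos_counts, 1)
    return loss[pos_counts > 0].mean()
```

With well-clustered embeddings and matching labels the loss is low; permuting the labels so positives point at dissimilar samples raises it, which is the signal that shapes a more generalized, transferable embedding space.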
Anthology ID:
2023.findings-emnlp.15
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2023
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
194–206
URL:
https://aclanthology.org/2023.findings-emnlp.15
DOI:
10.18653/v1/2023.findings-emnlp.15
Cite (ACL):
Yigong Wang, Zhuoyi Wang, Yu Lin, Jinghui Guo, Sadaf Halim, and Latifur Khan. 2023. Dual Contrastive Learning Framework for Incremental Text Classification. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 194–206, Singapore. Association for Computational Linguistics.
Cite (Informal):
Dual Contrastive Learning Framework for Incremental Text Classification (Wang et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-emnlp.15.pdf