CLASSIC: Continual and Contrastive Learning of Aspect Sentiment Classification Tasks

Zixuan Ke, Bing Liu, Hu Xu, Lei Shu


Abstract
This paper studies continual learning (CL) of a sequence of aspect sentiment classification (ASC) tasks in a particular CL setting called domain incremental learning (DIL). Each task is from a different domain or product. The DIL setting is particularly suited to ASC because at test time the system need not know the task/domain to which the test data belongs. To our knowledge, this setting has not been studied before for ASC. This paper proposes a novel model called CLASSIC. The key novelty is a contrastive continual learning method that enables both knowledge transfer across tasks and knowledge distillation from old tasks to the new task, which eliminates the need for task ids in testing. Experimental results show the high effectiveness of CLASSIC.
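To make the two ingredients named in the abstract concrete, below is a minimal, generic sketch of (a) an InfoNCE-style contrastive loss and (b) a soft-label distillation loss between an old-task (teacher) model and the new-task (student) model. This is an illustrative sketch of the standard formulations only, not the paper's exact CLASSIC objective; all function names and hyperparameters (`tau`, `T`) are assumptions.

```python
import math

def cosine(u, v):
    # cosine similarity between two feature vectors
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def info_nce(anchor, positive, negatives, tau=0.1):
    # InfoNCE contrastive loss: pull the positive toward the anchor,
    # push negatives away; tau is the temperature.
    sims = [cosine(anchor, positive)] + [cosine(anchor, n) for n in negatives]
    logits = [s / tau for s in sims]
    m = max(logits)  # log-sum-exp stabilization
    denom = sum(math.exp(l - m) for l in logits)
    return -(logits[0] - m - math.log(denom))

def softmax(xs):
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def kd_loss(teacher_logits, student_logits, T=2.0):
    # Knowledge distillation: cross-entropy between temperature-softened
    # teacher and student distributions (old task -> new task).
    p = softmax([x / T for x in teacher_logits])
    q = softmax([x / T for x in student_logits])
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q))
```

In a continual-learning pipeline these two terms would typically be summed (with weighting coefficients) into the training objective, so the new-task model both discriminates its own examples and stays close to the old-task model's predictions.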
Anthology ID:
2021.emnlp-main.550
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
6871–6883
URL:
https://aclanthology.org/2021.emnlp-main.550
DOI:
10.18653/v1/2021.emnlp-main.550
Cite (ACL):
Zixuan Ke, Bing Liu, Hu Xu, and Lei Shu. 2021. CLASSIC: Continual and Contrastive Learning of Aspect Sentiment Classification Tasks. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 6871–6883, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
CLASSIC: Continual and Contrastive Learning of Aspect Sentiment Classification Tasks (Ke et al., EMNLP 2021)
PDF:
https://aclanthology.org/2021.emnlp-main.550.pdf
Video:
https://aclanthology.org/2021.emnlp-main.550.mp4
Code:
zixuanke/pycontinual