Class Incremental Learning for Intent Classification with Limited or No Old Data

Debjit Paul, Daniil Sorokin, Judith Gaspers


Abstract
In this paper, we explore class-incremental learning for intent classification (IC) in a setting with limited old data available. IC is the task of mapping user utterances to their corresponding intents. Even though class-incremental learning without storing old data has high potential to reduce human and computational resources in industry NLP model releases, to the best of our knowledge, it has not previously been studied for NLP classification tasks. In this work, we compare several contemporary class-incremental learning methods, i.e., BERT warm start, L2, Elastic Weight Consolidation, RecAdam, and Knowledge Distillation, within two realistic class-incremental learning scenarios: one in which only the previous model is assumed to be available, but no data corresponding to old classes, and one in which limited unlabeled data for old classes is assumed to be available. Our results indicate that, among the investigated continual learning methods, Knowledge Distillation worked best for our class-incremental learning tasks, and that adding limited unlabeled data helps the model in both adaptability and stability.
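The Knowledge Distillation setting the abstract describes combines a standard classification loss on the new classes with a distillation term that keeps the student's old-class predictions close to the previous (teacher) model. The following is a minimal NumPy sketch of that kind of objective; the function name, the mixing weight `alpha`, the temperature `T`, and the restriction of the distillation term to the old-class logits are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax; higher T softens the distribution."""
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_class_incremental_loss(student_logits, teacher_logits,
                              label, n_old, T=2.0, alpha=0.5):
    """Combined loss for class-incremental learning with distillation.

    - Cross-entropy on the new training example over all current classes.
    - A distillation term over the first n_old logits: cross-entropy
      between the old model's softened predictions and the student's,
      discouraging forgetting of the old classes.
    """
    # Cross-entropy for the labeled new-data example
    p = softmax(student_logits)
    ce = -np.log(p[label] + 1e-12)

    # Distillation restricted to the classes the old model knows
    q_teacher = softmax(np.asarray(teacher_logits)[:n_old], T)
    log_q_student = np.log(softmax(np.asarray(student_logits)[:n_old], T) + 1e-12)
    kd = -(q_teacher * log_q_student).sum() * (T ** 2)  # T^2 rescaling (Hinton et al.)

    return alpha * ce + (1 - alpha) * kd
```

For example, a student that both predicts the new label and reproduces the teacher's old-class distribution incurs a small loss, while one that flips its old-class logits is penalized by the distillation term even if no old data is stored.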
Anthology ID:
2022.evonlp-1.4
Volume:
Proceedings of the First Workshop on Ever Evolving NLP (EvoNLP)
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates (Hybrid)
Editors:
Francesco Barbieri, Jose Camacho-Collados, Bhuwan Dhingra, Luis Espinosa-Anke, Elena Gribovskaya, Angeliki Lazaridou, Daniel Loureiro, Leonardo Neves
Venue:
EvoNLP
Publisher:
Association for Computational Linguistics
Pages:
16–25
URL:
https://aclanthology.org/2022.evonlp-1.4
DOI:
10.18653/v1/2022.evonlp-1.4
Cite (ACL):
Debjit Paul, Daniil Sorokin, and Judith Gaspers. 2022. Class Incremental Learning for Intent Classification with Limited or No Old Data. In Proceedings of the First Workshop on Ever Evolving NLP (EvoNLP), pages 16–25, Abu Dhabi, United Arab Emirates (Hybrid). Association for Computational Linguistics.
Cite (Informal):
Class Incremental Learning for Intent Classification with Limited or No Old Data (Paul et al., EvoNLP 2022)
PDF:
https://aclanthology.org/2022.evonlp-1.4.pdf