Parameter-efficient Continual Learning Framework in Industrial Real-time Text Classification System

Tao Zhu, Zhe Zhao, Weijie Liu, Jiachi Liu, Yiren Chen, Weiquan Mao, Haoyan Liu, Kunbo Ding, Yudong Li, Xuefeng Yang


Abstract
Catastrophic forgetting is a key challenge when deploying models in industrial real-time systems, which require a model to quickly master a new task without forgetting old ones. Continual learning aims to solve this problem; however, it usually updates all model parameters, resulting in long training times and preventing rapid deployment. To address this challenge, we propose a parameter-efficient continual learning framework in which efficient parameters are selected through an offline parameter selection strategy and then trained with an online regularization method. In our framework, only a few parameters need to be updated, which not only alleviates catastrophic forgetting but also allows the model to be saved as just the changed parameters instead of all parameters. Extensive experiments examine the effectiveness of our proposal. We believe this paper provides useful insights and experience for developing deep learning-based online real-time systems.
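The abstract's two-stage idea can be sketched in a few lines. This is a minimal illustrative toy, not the paper's actual method: the selection criterion (here, gradient magnitude on the new task), the regularizer (an L2 anchor to the old weights), and all names and values are assumptions made for illustration.

```python
# Hypothetical sketch of parameter-efficient continual learning:
# (1) offline: select a small subset of parameters to update;
# (2) online: train only that subset, regularized toward the old weights;
# (3) deployment: save only the changed parameters, not the full model.

def select_params(grad, k):
    """Offline selection (assumed criterion): pick the k parameters
    with the largest gradient magnitude on the new task."""
    idx = sorted(range(len(grad)), key=lambda i: abs(grad[i]), reverse=True)
    return idx[:k]

def train_step(w, grad, selected, w_old, lr=0.1, lam=0.5):
    """Online update of only the selected parameters, with an
    L2 anchor lam * (w_i - w_old_i)^2 to limit forgetting."""
    new_w = list(w)
    for i in selected:
        new_w[i] = w[i] - lr * (grad[i] + 2 * lam * (w[i] - w_old[i]))
    return new_w

w_old = [0.2, -1.0, 0.5, 0.0]        # weights learned on the old task
grad_new = [0.9, -0.1, 0.05, 1.2]    # gradients measured on the new task
selected = select_params(grad_new, k=2)      # only 2 of 4 params change
w_new = train_step(w_old, grad_new, selected, w_old)

# Only the changed parameters need to be stored for deployment.
delta = {i: w_new[i] for i in selected}
print(selected, delta)
```

Because the unselected weights stay byte-identical to the deployed model, a rollout only ships the `delta` dictionary, which is what makes the saved update small.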
Anthology ID:
2022.naacl-industry.35
Volume:
Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies: Industry Track
Month:
July
Year:
2022
Address:
Hybrid: Seattle, Washington + Online
Editors:
Anastassia Loukina, Rashmi Gangadharaiah, Bonan Min
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
315–323
URL:
https://aclanthology.org/2022.naacl-industry.35
DOI:
10.18653/v1/2022.naacl-industry.35
Cite (ACL):
Tao Zhu, Zhe Zhao, Weijie Liu, Jiachi Liu, Yiren Chen, Weiquan Mao, Haoyan Liu, Kunbo Ding, Yudong Li, and Xuefeng Yang. 2022. Parameter-efficient Continual Learning Framework in Industrial Real-time Text Classification System. In Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies: Industry Track, pages 315–323, Hybrid: Seattle, Washington + Online. Association for Computational Linguistics.
Cite (Informal):
Parameter-efficient Continual Learning Framework in Industrial Real-time Text Classification System (Zhu et al., NAACL 2022)
PDF:
https://aclanthology.org/2022.naacl-industry.35.pdf
Video:
https://aclanthology.org/2022.naacl-industry.35.mp4