Domain Incremental Lifelong Learning in an Open World

Yi Dai, Hao Lang, Yinhe Zheng, Bowen Yu, Fei Huang, Yongbin Li


Abstract
Lifelong learning (LL) is an important ability for NLP models to learn new tasks continuously. Architecture-based approaches are reported to be effective implementations for LL models. However, it is non-trivial to extend previous approaches to domain incremental LL scenarios, since they either require access to task identities in the testing phase or cannot handle samples from unseen tasks. In this paper, we propose Diana: a dynamic architecture-based lifelong learning model that learns a sequence of tasks with a prompt-enhanced language model. Four types of hierarchically organized prompts are used in Diana to capture knowledge at different granularities. Specifically, we dedicate task-level prompts to capture task-specific knowledge to retain high LL performance, and maintain instance-level prompts to learn knowledge shared across input samples to improve the model's generalization performance. Moreover, we reserve separate prompts to explicitly model unseen tasks and introduce a set of prompt key vectors to facilitate knowledge sharing between tasks. Extensive experiments demonstrate that Diana outperforms state-of-the-art LL models, especially in handling unseen tasks.
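To make the key-vector idea from the abstract concrete, below is a minimal sketch (not the authors' implementation) of query-key prompt selection: a pool of instance-level prompt embeddings, each paired with a learnable key vector, selected by cosine similarity against a query encoding of the input, alongside dedicated task-level prompts including one reserved for unseen tasks. All names and hyperparameters (PromptPool, pool_size, top_k, the "unseen" entry) are hypothetical choices for illustration; Diana's actual architecture and training objectives are described in the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class PromptPool(nn.Module):
    """Pool of prompts with key vectors for query-based selection (illustrative)."""

    def __init__(self, pool_size: int, prompt_len: int, dim: int, top_k: int = 2):
        super().__init__()
        self.prompts = nn.Parameter(torch.randn(pool_size, prompt_len, dim) * 0.02)
        self.keys = nn.Parameter(torch.randn(pool_size, dim) * 0.02)
        self.top_k = top_k

    def forward(self, query: torch.Tensor) -> torch.Tensor:
        # query: [batch, dim], e.g. a frozen encoder's pooled representation.
        sim = F.cosine_similarity(query.unsqueeze(1), self.keys.unsqueeze(0), dim=-1)
        idx = sim.topk(self.top_k, dim=-1).indices       # [batch, top_k]
        selected = self.prompts[idx]                     # [batch, top_k, prompt_len, dim]
        return selected.flatten(1, 2)                    # [batch, top_k * prompt_len, dim]


# Hierarchy sketch: one general prompt, per-task prompts (plus one reserved for
# unseen tasks), and an instance-level pool matched via key vectors.
dim, prompt_len = 768, 5
general_prompt = nn.Parameter(torch.randn(prompt_len, dim) * 0.02)
task_prompts = nn.ParameterDict({
    name: nn.Parameter(torch.randn(prompt_len, dim) * 0.02)
    for name in ["task_1", "task_2", "unseen"]           # "unseen" handles unknown tasks
})
instance_pool = PromptPool(pool_size=20, prompt_len=prompt_len, dim=dim)

# At test time the task identity is unknown, so a routing step (not shown) would
# pick a task prompt or fall back to the "unseen" prompt; the chosen prompts are
# then prepended to the input embeddings of the language model.
query = torch.randn(4, dim)                              # dummy batch of query encodings
instance_prompts = instance_pool(query)
print(instance_prompts.shape)                            # torch.Size([4, 10, 768])
```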
Anthology ID: 2023.findings-acl.361
Volume: Findings of the Association for Computational Linguistics: ACL 2023
Month: July
Year: 2023
Address: Toronto, Canada
Editors: Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 5844–5865
URL: https://aclanthology.org/2023.findings-acl.361
DOI: 10.18653/v1/2023.findings-acl.361
Cite (ACL): Yi Dai, Hao Lang, Yinhe Zheng, Bowen Yu, Fei Huang, and Yongbin Li. 2023. Domain Incremental Lifelong Learning in an Open World. In Findings of the Association for Computational Linguistics: ACL 2023, pages 5844–5865, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal): Domain Incremental Lifelong Learning in an Open World (Dai et al., Findings 2023)
PDF: https://aclanthology.org/2023.findings-acl.361.pdf