Learn Continually, Generalize Rapidly: Lifelong Knowledge Accumulation for Few-shot Learning

Xisen Jin, Bill Yuchen Lin, Mohammad Rostami, Xiang Ren


Abstract
The ability to continuously expand knowledge over time and to use it to rapidly generalize to new tasks is a key feature of human linguistic intelligence. Existing models that pursue rapid generalization to new tasks (e.g., few-shot learning methods), however, are mostly trained in a single shot on fixed datasets and cannot dynamically expand their knowledge, while continual learning algorithms are not specifically designed for rapid generalization. We present a new learning setup, Continual Learning of Few-Shot Learners (CLIF), to address the challenges of both settings in a unified framework. In CLIF, a model learns from a sequence of diverse NLP tasks arriving one after another, accumulating knowledge to improve generalization to new tasks while also retaining performance on tasks learned earlier. We examine how generalization ability is affected in the continual learning setup, evaluate a number of continual learning algorithms, and propose a novel regularized adapter generation approach. We find that catastrophic forgetting affects generalization ability to a lesser degree than performance on seen tasks, and that continual learning algorithms can still bring considerable benefit to generalization ability.
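The evaluation protocol described above can be sketched as a simple loop: train sequentially on upstream tasks, measure retained accuracy on previously seen tasks (forgetting), then adapt to held-out tasks from only a few examples (generalization). The following is a minimal illustrative skeleton of that protocol, not the authors' implementation; the `DummyModel` and the toy task format are assumptions made for illustration only.

```python
class DummyModel:
    """Toy stand-in for a continual few-shot learner: it simply
    memorizes (input, label) pairs it has trained on."""

    def __init__(self):
        self.memory = {}

    def train(self, examples):
        for x, y in examples:
            self.memory[x] = y

    def predict(self, x):
        return self.memory.get(x)


def accuracy(model, examples):
    correct = sum(model.predict(x) == y for x, y in examples)
    return correct / len(examples)


def run_clif(model, upstream_tasks, fewshot_tasks, k=2):
    """Sequentially train on upstream tasks, then few-shot adapt.

    Returns per-step accuracies on all tasks seen so far (to expose
    catastrophic forgetting) and few-shot accuracies on new tasks.
    """
    seen_acc = []
    for i, task in enumerate(upstream_tasks):
        model.train(task["train"])
        # After each task, re-evaluate on every task seen so far.
        seen_acc.append([accuracy(model, t["test"])
                         for t in upstream_tasks[: i + 1]])
    # Few-shot generalization: adapt with only k examples per new task.
    fewshot_acc = []
    for task in fewshot_tasks:
        model.train(task["train"][:k])
        fewshot_acc.append(accuracy(model, task["test"]))
    return seen_acc, fewshot_acc


# Toy usage: two upstream tasks, one held-out few-shot task.
task1 = {"train": [("a", 0), ("b", 1)], "test": [("a", 0), ("b", 1)]}
task2 = {"train": [("c", 0)], "test": [("c", 0)]}
fewshot = {"train": [("x", 1), ("y", 0), ("z", 1)],
           "test": [("x", 1), ("y", 0)]}

seen, fs = run_clif(DummyModel(), [task1, task2], [fewshot], k=2)
print(seen)  # [[1.0], [1.0, 1.0]] — the toy model never forgets
print(fs)    # [1.0]
```

A real CLIF model would, of course, replace `DummyModel` with a pretrained transformer (optionally with adapters and a continual learning algorithm), and the `seen_acc` matrix would reveal forgetting as off-diagonal drops.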
Anthology ID:
2021.findings-emnlp.62
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2021
Month:
November
Year:
2021
Address:
Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
Findings
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
714–729
URL:
https://aclanthology.org/2021.findings-emnlp.62
DOI:
10.18653/v1/2021.findings-emnlp.62
Cite (ACL):
Xisen Jin, Bill Yuchen Lin, Mohammad Rostami, and Xiang Ren. 2021. Learn Continually, Generalize Rapidly: Lifelong Knowledge Accumulation for Few-shot Learning. In Findings of the Association for Computational Linguistics: EMNLP 2021, pages 714–729, Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Learn Continually, Generalize Rapidly: Lifelong Knowledge Accumulation for Few-shot Learning (Jin et al., Findings 2021)
PDF:
https://aclanthology.org/2021.findings-emnlp.62.pdf
Video:
https://aclanthology.org/2021.findings-emnlp.62.mp4
Code:
INK-USC/CLIF
Data:
CARER
GLUE