In-context Continual Learning Assisted by an External Continual Learner

Saleh Momeni, Sahisnu Mazumder, Zixuan Ke, Bing Liu


Abstract
Existing continual learning (CL) methods mainly rely on fine-tuning or adapting large language models (LLMs) and still suffer from catastrophic forgetting (CF). Little work has been done to exploit in-context learning (ICL), which leverages the extensive knowledge within LLMs for CL without updating any parameters. However, incrementally learning each new task with ICL requires adding training examples from every class of the task to the prompt, which hampers scalability: the prompt quickly becomes excessively long, exceeding the input token limit of the underlying LLM, and the overextended context degrades the model's performance. To address this, we introduce InCA, a novel approach that integrates an external continual learner (ECL) with ICL to enable scalable CL without CF. The ECL is built incrementally to pre-select a small subset of likely classes for each test instance. By restricting the ICL prompt to only these selected classes, InCA keeps prompts short while maintaining high performance. Experimental results demonstrate that InCA significantly outperforms existing CL baselines.
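A minimal sketch of the idea described in the abstract, not the paper's actual implementation: here the external continual learner is assumed to be a set of incrementally maintained class-mean embeddings, the most likely classes for a test instance are chosen by cosine similarity, and the ICL prompt is built only from those classes. All names (`ExternalContinualLearner`, `top_k_classes`, `build_prompt`) and design details are illustrative assumptions.

```python
# Hypothetical sketch of an external continual learner (ECL) that pre-selects
# likely classes before building the in-context learning (ICL) prompt.
# Assumes text embeddings are computed elsewhere (e.g., by a sentence encoder).

import numpy as np


class ExternalContinualLearner:
    """Keeps one running mean embedding per class, updated incrementally."""

    def __init__(self):
        self.means = {}   # class label -> mean embedding
        self.counts = {}  # class label -> number of examples seen

    def update(self, label, embedding):
        # Incremental mean update: no stored examples, no LLM parameter updates.
        embedding = np.asarray(embedding, dtype=float)
        if label not in self.means:
            self.means[label] = embedding.copy()
            self.counts[label] = 1
        else:
            self.counts[label] += 1
            self.means[label] += (embedding - self.means[label]) / self.counts[label]

    def top_k_classes(self, query_embedding, k=5):
        # Rank all classes seen so far by cosine similarity and keep the top k.
        q = np.asarray(query_embedding, dtype=float)

        def cosine(a, b):
            return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

        scored = sorted(((cosine(q, m), c) for c, m in self.means.items()), reverse=True)
        return [c for _, c in scored[:k]]


def build_prompt(test_text, selected_classes, examples_by_class):
    """Restrict the ICL prompt to demonstrations from the pre-selected classes."""
    lines = []
    for c in selected_classes:
        for ex in examples_by_class.get(c, [])[:2]:  # a few shots per class
            lines.append(f"Text: {ex}\nLabel: {c}")
    lines.append(f"Text: {test_text}\nLabel:")
    return "\n\n".join(lines)
```

Because only the k selected classes (and their demonstrations) enter the prompt, the prompt length stays roughly constant as new tasks and classes arrive, which is the scalability property the abstract highlights.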
Anthology ID:
2025.coling-main.487
Volume:
Proceedings of the 31st International Conference on Computational Linguistics
Month:
January
Year:
2025
Address:
Abu Dhabi, UAE
Editors:
Owen Rambow, Leo Wanner, Marianna Apidianaki, Hend Al-Khalifa, Barbara Di Eugenio, Steven Schockaert
Venue:
COLING
Publisher:
Association for Computational Linguistics
Pages:
7292–7306
URL:
https://aclanthology.org/2025.coling-main.487/
Cite (ACL):
Saleh Momeni, Sahisnu Mazumder, Zixuan Ke, and Bing Liu. 2025. In-context Continual Learning Assisted by an External Continual Learner. In Proceedings of the 31st International Conference on Computational Linguistics, pages 7292–7306, Abu Dhabi, UAE. Association for Computational Linguistics.
Cite (Informal):
In-context Continual Learning Assisted by an External Continual Learner (Momeni et al., COLING 2025)
PDF:
https://aclanthology.org/2025.coling-main.487.pdf