Is Parameter Collision Hindering Continual Learning in LLMs?

Shuo Yang, Kun-Peng Ning, Yu-Yang Liu, Jia-Yu Yao, Yong-Hong Tian, Yi-Bing Song, Li Yuan


Abstract
Large Language Models (LLMs) often suffer from catastrophic forgetting when learning multiple tasks sequentially, making continual learning (CL) essential for their dynamic deployment. Existing state-of-the-art (SOTA) methods, such as O-LoRA, typically focus on constructing orthogonal task subspaces to decouple parameter interdependence across domains. In this paper, we reveal that building non-collision parameters is a more critical factor in addressing CL challenges. Our theoretical and experimental analyses demonstrate that non-collision parameters provide better task orthogonality: non-collision is a sufficient but not necessary condition for orthogonality. Furthermore, knowledge from multiple domains is preserved in non-collision parameter subspaces, making previously seen data harder to forget. Leveraging this insight, we propose Non-collision Low-Rank Adaptation (N-LoRA), a simple yet effective approach that exploits low collision rates to enhance CL in LLMs. Experimental results on multiple CL benchmarks indicate that N-LoRA achieves superior performance (+2.9%), higher task orthogonality (4.1× higher), and lower parameter collision (58.1× lower) than SOTA methods.
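To make the abstract's notion of parameter collision concrete, here is a minimal sketch (our own illustration, not the paper's exact definition) that measures the fraction of weight positions where two tasks' LoRA updates are simultaneously non-zero; the helper `collision_rate` and the threshold `eps` are hypothetical names introduced for this example.

```python
import torch

def collision_rate(delta_w1: torch.Tensor, delta_w2: torch.Tensor,
                   eps: float = 1e-6) -> float:
    """Fraction of positions where both task updates are non-zero.

    Sketch only: delta_w1 and delta_w2 are assumed to be the LoRA weight
    updates (B @ A) learned for two different tasks; eps thresholds
    near-zero entries. The paper's formal metric may differ.
    """
    active1 = delta_w1.abs() > eps            # parameters task 1 actually uses
    active2 = delta_w2.abs() > eps            # parameters task 2 actually uses
    collisions = (active1 & active2).sum().item()
    return collisions / delta_w1.numel()

# Toy usage: two rank-4 LoRA updates on a 64x64 weight matrix.
# Dense random updates collide almost everywhere; sparse updates would not.
B1, A1 = torch.randn(64, 4), torch.randn(4, 64)
B2, A2 = torch.randn(64, 4), torch.randn(4, 64)
print(collision_rate(B1 @ A1, B2 @ A2))
```

Under this reading, driving the collision rate toward zero means the two tasks occupy disjoint parameter subspaces, which in turn implies orthogonality of their updates (but not conversely).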
Anthology ID:
2025.coling-main.286
Volume:
Proceedings of the 31st International Conference on Computational Linguistics
Month:
January
Year:
2025
Address:
Abu Dhabi, UAE
Editors:
Owen Rambow, Leo Wanner, Marianna Apidianaki, Hend Al-Khalifa, Barbara Di Eugenio, Steven Schockaert
Venue:
COLING
Publisher:
Association for Computational Linguistics
Pages:
4243–4259
URL:
https://aclanthology.org/2025.coling-main.286/
Cite (ACL):
Shuo Yang, Kun-Peng Ning, Yu-Yang Liu, Jia-Yu Yao, Yong-Hong Tian, Yi-Bing Song, and Li Yuan. 2025. Is Parameter Collision Hindering Continual Learning in LLMs?. In Proceedings of the 31st International Conference on Computational Linguistics, pages 4243–4259, Abu Dhabi, UAE. Association for Computational Linguistics.
Cite (Informal):
Is Parameter Collision Hindering Continual Learning in LLMs? (Yang et al., COLING 2025)
PDF:
https://aclanthology.org/2025.coling-main.286.pdf