TL-CL: Task And Language Incremental Continual Learning

Shrey Satapara, P. K. Srijith
Abstract
This paper introduces and investigates the problem of Task and Language Incremental Continual Learning (TLCL), wherein a multilingual model is systematically updated to accommodate new tasks in previously learned languages or new languages for established tasks. This significant yet previously unexplored setting holds substantial practical relevance, as it mirrors the dynamic requirements of real-world applications. We benchmark a representative set of continual learning (CL) algorithms for TLCL. Furthermore, we propose Task and Language-Specific Adapters (TLSA), an adapter-based parameter-efficient fine-tuning strategy. TLSA facilitates cross-lingual and cross-task transfer and outperforms other parameter-efficient fine-tuning techniques. Crucially, TLSA reduces the parameter growth from storing adapters from polynomial to linear complexity, compared with parameter isolation-based adapter tuning. We conducted experiments on several NLP tasks spanning several languages and observed that TLSA outperforms all other parameter-efficient approaches without requiring access to historical data for replay.
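The claimed reduction from polynomial to linear parameter growth can be illustrated with a small counting sketch. This is a hypothetical illustration (function names and the counting model are ours, not from the paper): parameter isolation keeps one adapter per (task, language) pair, while a TLSA-style scheme keeps one adapter per task plus one per language.

```python
def isolation_adapter_count(num_tasks: int, num_languages: int) -> int:
    """One adapter per (task, language) pair: grows as T * L (polynomial)."""
    return num_tasks * num_languages


def tlsa_adapter_count(num_tasks: int, num_languages: int) -> int:
    """One adapter per task plus one per language: grows as T + L (linear)."""
    return num_tasks + num_languages


if __name__ == "__main__":
    # The gap widens as tasks and languages accumulate over time.
    for t, l in [(3, 3), (5, 10), (20, 20)]:
        print(f"T={t}, L={l}: isolation={isolation_adapter_count(t, l)}, "
              f"tlsa={tlsa_adapter_count(t, l)}")
```

With 20 tasks and 20 languages, isolation would store 400 adapters, while a task-plus-language scheme stores 40.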
Anthology ID:
2024.emnlp-main.676
Volume:
Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2024
Address:
Miami, Florida, USA
Editors:
Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
12123–12142
URL:
https://aclanthology.org/2024.emnlp-main.676
Cite (ACL):
Shrey Satapara and P. K. Srijith. 2024. TL-CL: Task And Language Incremental Continual Learning. In Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing, pages 12123–12142, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal):
TL-CL: Task And Language Incremental Continual Learning (Satapara & Srijith, EMNLP 2024)
PDF:
https://aclanthology.org/2024.emnlp-main.676.pdf
Software:
 2024.emnlp-main.676.software.zip
Data:
 2024.emnlp-main.676.data.zip