Preserving Generalization of Language models in Few-shot Continual Relation Extraction

Quyen Tran, Nguyen Xuan Thanh, Nguyen Hoang Anh, Nam Le Hai, Trung Le, Linh Van Ngo, Thien Huu Nguyen


Abstract
Few-shot Continual Relation Extraction (FCRE) is an emerging and dynamic area of study where models can sequentially integrate knowledge from new relations with limited labeled data while circumventing catastrophic forgetting and preserving prior knowledge from pre-trained backbones. In this work, we introduce a novel method that leverages often-discarded language model heads. By employing these components via a mutual information maximization strategy, our approach helps maintain prior knowledge from the pre-trained backbone and strategically aligns the primary classification head, thereby enhancing model performance. Furthermore, we explore the potential of Large Language Models (LLMs), renowned for their wealth of knowledge, in addressing FCRE challenges. Our comprehensive experimental results underscore the efficacy of the proposed method and offer valuable insights for future work.
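The abstract mentions aligning the classification head with the retained language model head via mutual information maximization, but gives no algorithmic detail. As an illustration only (not the paper's implementation), the sketch below shows one common way such an objective is estimated in practice: an InfoNCE-style lower bound on the mutual information between paired representations from two heads. All names and shapes here are assumptions.

```python
import numpy as np

def l2_normalize(x):
    """Project each row onto the unit sphere."""
    return x / np.linalg.norm(x, axis=1, keepdims=True)

def info_nce_loss(z_a, z_b, temperature=0.1):
    """InfoNCE lower-bound estimator of mutual information.

    z_a, z_b: (batch, dim) L2-normalized representations of the SAME batch
    of inputs from two different heads (e.g. a language-model head space
    and a classification head space). Row i of z_a and row i of z_b form
    the positive pair; all other rows in the batch act as negatives.
    Minimizing this loss maximizes the MI lower bound between the heads.
    """
    sim = z_a @ z_b.T / temperature              # (batch, batch) similarities
    sim = sim - sim.max(axis=1, keepdims=True)   # numerical stability
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))           # cross-entropy on positives

rng = np.random.default_rng(0)
h = l2_normalize(rng.normal(size=(8, 16)))

# Perfectly aligned heads score a lower loss (higher MI estimate)
# than a mismatched pairing of the same vectors.
aligned = info_nce_loss(h, h)
shuffled = info_nce_loss(h, h[::-1].copy())
```

In a continual learning setting, a loss of this shape would be added to the task loss so that the classifier's representations stay predictable from the frozen pre-trained head, discouraging drift away from the backbone's prior knowledge.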
Anthology ID:
2024.emnlp-main.763
Volume:
Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2024
Address:
Miami, Florida, USA
Editors:
Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
13771–13784
URL:
https://aclanthology.org/2024.emnlp-main.763
DOI:
10.18653/v1/2024.emnlp-main.763
Cite (ACL):
Quyen Tran, Nguyen Xuan Thanh, Nguyen Hoang Anh, Nam Le Hai, Trung Le, Linh Van Ngo, and Thien Huu Nguyen. 2024. Preserving Generalization of Language models in Few-shot Continual Relation Extraction. In Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing, pages 13771–13784, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal):
Preserving Generalization of Language models in Few-shot Continual Relation Extraction (Tran et al., EMNLP 2024)
PDF:
https://aclanthology.org/2024.emnlp-main.763.pdf
Software:
2024.emnlp-main.763.software.zip