Improving Continual Few-shot Relation Extraction through Relational Knowledge Distillation and Prototype Augmentation

Zhiheng Zhang, Daojian Zeng, Xue Bai


Abstract
In this paper, we focus on the challenging yet practical problem of Continual Few-shot Relation Extraction (CFRE), which requires extracting relations from a continual stream of new data in which each new relation comes with only a few labeled examples. The main challenges in CFRE are overfitting caused by few-shot learning and catastrophic forgetting caused by continual learning. To address these problems, we propose a novel framework called RK2DA, which seamlessly integrates prototype-based data augmentation with relational knowledge distillation. Specifically, RK2DA generates pseudo data by injecting Gaussian noise into the prototype embeddings and employs a novel two-phase, multi-teacher relational knowledge distillation method to transfer knowledge from different embedding spaces. Experimental results on the FewRel and TACRED datasets demonstrate that our method outperforms state-of-the-art baselines.
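The abstract names two mechanisms: pseudo-data generation by adding Gaussian noise to prototype embeddings, and multi-teacher knowledge distillation. The PyTorch sketch below illustrates only the generic form of both ideas; the prototype definition (class-mean embeddings), the noise scale, the temperature, and all function names are assumptions made for illustration, not the authors' RK2DA implementation.

```python
# Illustrative sketch of (1) prototype augmentation via Gaussian noise and
# (2) an averaged multi-teacher distillation loss. Hyperparameters
# (noise_std, temperature) and function names are hypothetical.
import torch
import torch.nn.functional as F


def class_prototypes(embeddings: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
    """Mean embedding per class (a standard prototype definition)."""
    classes = labels.unique()
    return torch.stack([embeddings[labels == c].mean(dim=0) for c in classes])


def augment_prototypes(prototypes: torch.Tensor, n_samples: int = 5,
                       noise_std: float = 0.1) -> torch.Tensor:
    """Generate pseudo embeddings by adding Gaussian noise to each prototype."""
    noise = torch.randn(prototypes.size(0), n_samples, prototypes.size(1)) * noise_std
    # Broadcast each prototype over its noise samples, then flatten to a batch.
    return (prototypes.unsqueeze(1) + noise).reshape(-1, prototypes.size(1))


def multi_teacher_kd_loss(student_logits: torch.Tensor,
                          teacher_logits: list,
                          temperature: float = 2.0) -> torch.Tensor:
    """Average temperature-scaled KL divergence against several teachers."""
    log_p_student = F.log_softmax(student_logits / temperature, dim=-1)
    losses = [
        F.kl_div(log_p_student, F.softmax(t / temperature, dim=-1),
                 reduction="batchmean") * temperature ** 2
        for t in teacher_logits
    ]
    return torch.stack(losses).mean()


if __name__ == "__main__":
    torch.manual_seed(0)
    # Toy 5-way 2-shot support set with 64-dimensional embeddings.
    emb = torch.randn(10, 64)
    labels = torch.arange(5).repeat_interleave(2)
    protos = class_prototypes(emb, labels)
    pseudo = augment_prototypes(protos)            # 25 pseudo embeddings
    student = torch.randn(8, 5)                    # toy logits over 5 relations
    teachers = [torch.randn(8, 5), torch.randn(8, 5)]
    print(pseudo.shape, multi_teacher_kd_loss(student, teachers).item())
```

In a continual-learning setting such as the paper's, pseudo embeddings like these would typically stand in for scarce memory samples of earlier relations; how RK2DA actually uses them, and how its two distillation phases differ, is specified in the full paper rather than here.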
Anthology ID: 2024.lrec-main.767
Volume: Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024)
Month: May
Year: 2024
Address: Torino, Italia
Editors: Nicoletta Calzolari, Min-Yen Kan, Veronique Hoste, Alessandro Lenci, Sakriani Sakti, Nianwen Xue
Venues: LREC | COLING
Publisher: ELRA and ICCL
Pages: 8756–8767
URL: https://aclanthology.org/2024.lrec-main.767
Cite (ACL): Zhiheng Zhang, Daojian Zeng, and Xue Bai. 2024. Improving Continual Few-shot Relation Extraction through Relational Knowledge Distillation and Prototype Augmentation. In Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024), pages 8756–8767, Torino, Italia. ELRA and ICCL.
Cite (Informal): Improving Continual Few-shot Relation Extraction through Relational Knowledge Distillation and Prototype Augmentation (Zhang et al., LREC-COLING 2024)
PDF: https://aclanthology.org/2024.lrec-main.767.pdf