SKD-NER: Continual Named Entity Recognition via Span-based Knowledge Distillation with Reinforcement Learning

Yi Chen, Liang He


Abstract
Continual learning for named entity recognition (CL-NER) aims to enable models to continuously learn new entity types while retaining the ability to recognize previously learned ones. However, current strategies fall short of effectively addressing the catastrophic forgetting of previously learned entity types. To tackle this issue, we propose SKD-NER, an efficient span-based continual learning NER model that innovatively incorporates reinforcement learning strategies to strengthen the model's resistance to catastrophic forgetting. Specifically, we leverage knowledge distillation (KD) to retain memory and employ reinforcement learning during the KD process to optimize the soft labels and distillation losses generated by the teacher model. This effectively prevents or mitigates catastrophic forgetting during continual learning, allowing the model to retain previously learned knowledge while acquiring new knowledge. Our experiments on two benchmark datasets demonstrate that our model significantly improves performance on the CL-NER task, outperforming state-of-the-art methods.
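To make the distillation component of the abstract concrete, the sketch below shows a generic span-level knowledge distillation loss for continual NER in PyTorch. It is only an illustrative assumption of how a teacher's predictions over old entity types can be combined with supervision on new types; the function name, tensor shapes, temperature, and weighting are hypothetical, and the paper's reinforcement-learning optimization of the soft labels and distillation losses is not reproduced here.

```python
# Hypothetical sketch: span-level knowledge distillation for continual NER.
# Not the authors' exact formulation; the RL-based adjustment of soft labels
# described in the paper is omitted.
import torch
import torch.nn.functional as F


def span_kd_loss(student_logits, teacher_logits, labels,
                 num_old_types, temperature=2.0, alpha=0.5):
    """Combine distillation on old entity types with supervision on new ones.

    student_logits: (num_spans, num_old_types + num_new_types)
    teacher_logits: (num_spans, num_old_types), from the frozen model of the
        previous continual-learning step.
    labels: (num_spans,) integer labels over the full (old + new) label space.
    """
    # Distill the teacher's softened distribution over previously learned types.
    t_soft = F.softmax(teacher_logits / temperature, dim=-1)
    s_log_soft = F.log_softmax(student_logits[:, :num_old_types] / temperature, dim=-1)
    kd = F.kl_div(s_log_soft, t_soft, reduction="batchmean") * temperature ** 2

    # Standard cross-entropy on the current step's annotated spans.
    ce = F.cross_entropy(student_logits, labels)

    return alpha * kd + (1.0 - alpha) * ce


if __name__ == "__main__":
    torch.manual_seed(0)
    num_spans, old_types, new_types = 8, 4, 2
    student = torch.randn(num_spans, old_types + new_types)
    teacher = torch.randn(num_spans, old_types)
    labels = torch.randint(0, old_types + new_types, (num_spans,))
    print(span_kd_loss(student, teacher, labels, num_old_types=old_types))
```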
Anthology ID:
2023.emnlp-main.413
Original:
2023.emnlp-main.413v1
Version 2:
2023.emnlp-main.413v2
Volume:
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
6689–6700
URL:
https://aclanthology.org/2023.emnlp-main.413
DOI:
10.18653/v1/2023.emnlp-main.413
Cite (ACL):
Yi Chen and Liang He. 2023. SKD-NER: Continual Named Entity Recognition via Span-based Knowledge Distillation with Reinforcement Learning. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 6689–6700, Singapore. Association for Computational Linguistics.
Cite (Informal):
SKD-NER: Continual Named Entity Recognition via Span-based Knowledge Distillation with Reinforcement Learning (Chen & He, EMNLP 2023)
PDF:
https://aclanthology.org/2023.emnlp-main.413.pdf
Video:
https://aclanthology.org/2023.emnlp-main.413.mp4