Incremental Prompting: Episodic Memory Prompt for Lifelong Event Detection

Minqian Liu, Shiyu Chang, Lifu Huang


Abstract
Lifelong event detection aims to incrementally update a model with new event types and data while retaining the capability to detect previously learned types. One critical challenge is that the model catastrophically forgets old types when continually trained on new data. In this paper, we introduce Episodic Memory Prompts (EMP) to explicitly retain the learned task-specific knowledge. Our method adopts continuous prompts for each task, which are optimized to instruct the model's prediction and to learn event-specific representations. The EMPs learned in previous tasks are carried along with the model in subsequent tasks, serving as a memory module that preserves old knowledge and transfers it to new tasks. Experimental results demonstrate the effectiveness of our method. Furthermore, we conduct a comprehensive analysis of the new and old event types in lifelong learning.
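The core mechanism described in the abstract, task-specific prompt blocks that are frozen once a task ends and carried forward as a memory, can be sketched roughly as follows. This is a hypothetical, minimal illustration of the idea (the class name, fields, and the use of NumPy are assumptions for illustration, not the authors' implementation):

```python
import numpy as np

class EpisodicMemoryPrompts:
    """Minimal sketch of episodic memory prompts for lifelong learning:
    each new task gets a fresh block of continuous prompt vectors, while
    blocks learned for earlier tasks are frozen and carried along as a
    memory of previously learned event types."""

    def __init__(self, hidden_dim, prompts_per_task):
        self.hidden_dim = hidden_dim
        self.prompts_per_task = prompts_per_task
        self.frozen = []     # prompt blocks from completed (old) tasks
        self.current = None  # trainable block for the ongoing task

    def start_task(self):
        # Freeze the block from the previous task, then allocate a new
        # trainable block for the incoming task.
        if self.current is not None:
            self.frozen.append(self.current)
        self.current = 0.02 * np.random.randn(
            self.prompts_per_task, self.hidden_dim
        )

    def prompt_sequence(self):
        # All old blocks plus the current one are prepended to the input,
        # so the model can still attend to old-type knowledge while
        # learning new types.
        blocks = self.frozen + ([self.current] if self.current is not None else [])
        return np.concatenate(blocks, axis=0)
```

In an actual model these vectors would be trainable parameters prepended to the encoder input; the sketch only shows the bookkeeping by which old-task prompts accumulate across tasks.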
Anthology ID:
2022.coling-1.189
Volume:
Proceedings of the 29th International Conference on Computational Linguistics
Month:
October
Year:
2022
Address:
Gyeongju, Republic of Korea
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
2157–2165
URL:
https://aclanthology.org/2022.coling-1.189
Cite (ACL):
Minqian Liu, Shiyu Chang, and Lifu Huang. 2022. Incremental Prompting: Episodic Memory Prompt for Lifelong Event Detection. In Proceedings of the 29th International Conference on Computational Linguistics, pages 2157–2165, Gyeongju, Republic of Korea. International Committee on Computational Linguistics.
Cite (Informal):
Incremental Prompting: Episodic Memory Prompt for Lifelong Event Detection (Liu et al., COLING 2022)
PDF:
https://aclanthology.org/2022.coling-1.189.pdf
Code
 vt-nlp/incremental_prompting
Data
MAVEN