Improving Continual Relation Extraction through Prototypical Contrastive Learning

Chengwei Hu, Deqing Yang, Haoliang Jin, Zhen Chen, Yanghua Xiao


Abstract
Continual relation extraction (CRE) aims to extract relations from continuously and iteratively arriving new data, where the major challenge is catastrophic forgetting of old tasks. To alleviate this critical problem and enhance CRE performance, we propose a novel Continual Relation Extraction framework with Contrastive Learning, namely CRECL, which combines a classification network and a prototypical contrastive network to achieve class-incremental learning for CRE. Specifically, in the contrastive network a given instance is contrasted with the prototype of each candidate relation stored in the memory module. This contrastive learning scheme makes the data distributions of all tasks more distinguishable, thereby further alleviating catastrophic forgetting. Our experimental results not only demonstrate CRECL's advantage over the state-of-the-art baselines on two public datasets, but also verify the effectiveness of CRECL's contrastive learning in improving performance.
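The prototype-based contrast described in the abstract can be illustrated with a minimal sketch: relation prototypes are mean embeddings of the instances kept in memory, and a new instance is scored against every candidate prototype with a temperature-scaled cosine softmax. This is only an illustration of the general scheme, not the paper's actual architecture; the function names, the cosine/softmax formulation, and the temperature value are assumptions.

```python
import numpy as np

def relation_prototypes(memory):
    """Compute one prototype per relation as the mean of its stored
    instance embeddings. `memory` maps relation name -> (n, d) array."""
    return {rel: embs.mean(axis=0) for rel, embs in memory.items()}

def contrastive_scores(instance, prototypes, temperature=0.1):
    """Contrast an instance embedding with every candidate relation
    prototype: cosine similarities scaled by a temperature, then
    normalized with a softmax (InfoNCE-style scoring, assumed here)."""
    rels = list(prototypes)
    protos = np.stack([prototypes[r] for r in rels])
    # L2-normalize so the dot product equals cosine similarity
    inst = instance / np.linalg.norm(instance)
    protos = protos / np.linalg.norm(protos, axis=1, keepdims=True)
    logits = protos @ inst / temperature
    exp = np.exp(logits - logits.max())  # stable softmax
    probs = exp / exp.sum()
    return dict(zip(rels, probs))

# Toy memory with two relations in a 2-d embedding space (illustrative only)
memory = {
    "born_in":   np.array([[1.0, 0.0], [0.9, 0.1]]),
    "works_for": np.array([[0.0, 1.0], [0.1, 0.9]]),
}
scores = contrastive_scores(np.array([1.0, 0.05]),
                            relation_prototypes(memory))
```

An instance embedded near the `born_in` prototype receives most of the probability mass, so the most similar prototype determines the predicted relation; sharpening this distribution during training is what keeps the per-task distributions distinguishable.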
Anthology ID:
2022.coling-1.163
Volume:
Proceedings of the 29th International Conference on Computational Linguistics
Month:
October
Year:
2022
Address:
Gyeongju, Republic of Korea
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
1885–1895
URL:
https://aclanthology.org/2022.coling-1.163
Cite (ACL):
Chengwei Hu, Deqing Yang, Haoliang Jin, Zhen Chen, and Yanghua Xiao. 2022. Improving Continual Relation Extraction through Prototypical Contrastive Learning. In Proceedings of the 29th International Conference on Computational Linguistics, pages 1885–1895, Gyeongju, Republic of Korea. International Committee on Computational Linguistics.
Cite (Informal):
Improving Continual Relation Extraction through Prototypical Contrastive Learning (Hu et al., COLING 2022)
PDF:
https://aclanthology.org/2022.coling-1.163.pdf
Data
FewRel