RAPL: A Relation-Aware Prototype Learning Approach for Few-Shot Document-Level Relation Extraction

Shiao Meng, Xuming Hu, Aiwei Liu, Shuang Li, Fukun Ma, Yawen Yang, Lijie Wen


Abstract
How can we identify semantic relations among entities in a document when only a few labeled documents are available? Few-shot document-level relation extraction (FSDLRE) is crucial for addressing the pervasive data scarcity problem in real-world scenarios. Metric-based meta-learning is an effective framework widely adopted for FSDLRE, which constructs class prototypes for classification. However, existing works often struggle to obtain class prototypes with accurate relational semantics: 1) To build a prototype for a target relation type, they aggregate the representations of all entity pairs holding that relation, even though these entity pairs may also hold other relations, which disturbs the prototype. 2) They use a set of generic NOTA (none-of-the-above) prototypes across all tasks, neglecting that the NOTA semantics differ across tasks with different target relation types. In this paper, we propose a relation-aware prototype learning method for FSDLRE to strengthen the relational semantics of prototype representations. By judiciously leveraging relation descriptions and realistic NOTA instances as guidance, our method effectively refines the relation prototypes and generates task-specific NOTA prototypes. Extensive experiments demonstrate that our method outperforms state-of-the-art approaches by 2.61% F1 on average across various settings of two FSDLRE benchmarks.
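The sketch below is a minimal illustration of the prototype-construction idea described in the abstract, not the authors' released implementation: it contrasts naive mean-pooling of entity-pair embeddings with a relation-aware variant that weights pairs by their similarity to the relation description embedding, and analogously forms a task-specific NOTA prototype. Function names, shapes, and the softmax weighting scheme are assumptions for illustration only.

```python
import torch
import torch.nn.functional as F

def build_relation_prototype(pair_embs: torch.Tensor,
                             rel_desc_emb: torch.Tensor) -> torch.Tensor:
    """Aggregate entity-pair embeddings into a relation prototype (hypothetical sketch).

    pair_embs:    (num_pairs, dim) embeddings of entity pairs labeled with the
                  target relation (these pairs may also hold other relations).
    rel_desc_emb: (dim,) embedding of the target relation's textual description,
                  used to weight pairs by their relevance to that relation.
    """
    # Naive baseline: uniform mean over all pair embeddings.
    # proto = pair_embs.mean(dim=0)

    # Relation-aware variant: weight each pair by its similarity to the relation
    # description, so pairs whose semantics match the target relation contribute more.
    sims = F.cosine_similarity(pair_embs, rel_desc_emb.unsqueeze(0), dim=-1)
    weights = torch.softmax(sims, dim=0)               # (num_pairs,)
    return (weights.unsqueeze(-1) * pair_embs).sum(dim=0)

def build_nota_prototype(nota_pair_embs: torch.Tensor) -> torch.Tensor:
    """Form a task-specific NOTA prototype from realistic NOTA pairs of the task
    (simplified here to mean-pooling; an assumption, not the paper's exact method)."""
    return nota_pair_embs.mean(dim=0)
```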
Anthology ID:
2023.emnlp-main.316
Volume:
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
5208–5226
URL:
https://aclanthology.org/2023.emnlp-main.316
DOI:
10.18653/v1/2023.emnlp-main.316
Cite (ACL):
Shiao Meng, Xuming Hu, Aiwei Liu, Shuang Li, Fukun Ma, Yawen Yang, and Lijie Wen. 2023. RAPL: A Relation-Aware Prototype Learning Approach for Few-Shot Document-Level Relation Extraction. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 5208–5226, Singapore. Association for Computational Linguistics.
Cite (Informal):
RAPL: A Relation-Aware Prototype Learning Approach for Few-Shot Document-Level Relation Extraction (Meng et al., EMNLP 2023)
PDF:
https://aclanthology.org/2023.emnlp-main.316.pdf
Video:
https://aclanthology.org/2023.emnlp-main.316.mp4