Document-Level Relation Extraction via Pair-Aware and Entity-Enhanced Representation Learning

Xiusheng Huang, Hang Yang, Yubo Chen, Jun Zhao, Kang Liu, Weijian Sun, Zuyu Zhao


Abstract
Document-level relation extraction aims to recognize relations among multiple entity pairs within an entire document. Recent methods achieve considerable performance but still suffer from two challenges: a) relational entity pairs are sparse, and b) the representation of entity pairs is insufficient. In this paper, we propose the Pair-Aware and Entity-Enhanced (PAEE) model to address these two challenges. For the first challenge, we design a Pair-Aware Representation module to predict potential relational entity pairs, which constrains relation extraction to the predicted subset of entity pairs rather than all pairs; for the second, we introduce an Entity-Enhanced Representation module to assemble directional entity pairs and obtain a holistic understanding of the entire document. Experimental results show that our approach achieves state-of-the-art performance on four benchmark datasets: DocRED, DWIE, CDR, and GDA.
Anthology ID:
2022.coling-1.213
Volume:
Proceedings of the 29th International Conference on Computational Linguistics
Month:
October
Year:
2022
Address:
Gyeongju, Republic of Korea
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
2418–2428
URL:
https://aclanthology.org/2022.coling-1.213
Cite (ACL):
Xiusheng Huang, Hang Yang, Yubo Chen, Jun Zhao, Kang Liu, Weijian Sun, and Zuyu Zhao. 2022. Document-Level Relation Extraction via Pair-Aware and Entity-Enhanced Representation Learning. In Proceedings of the 29th International Conference on Computational Linguistics, pages 2418–2428, Gyeongju, Republic of Korea. International Committee on Computational Linguistics.
Cite (Informal):
Document-Level Relation Extraction via Pair-Aware and Entity-Enhanced Representation Learning (Huang et al., COLING 2022)
PDF:
https://aclanthology.org/2022.coling-1.213.pdf
Data
DWIE, DocRED