Entity and Evidence Guided Document-Level Relation Extraction

Kevin Huang, Peng Qi, Guangtao Wang, Tengyu Ma, Jing Huang


Abstract
Document-level relation extraction is a challenging task, requiring reasoning over multiple sentences to predict a set of relations in a document. In this paper, we propose a novel framework E2GRE (Entity and Evidence Guided Relation Extraction) that jointly extracts relations and the underlying evidence sentences by using a large pretrained language model (LM) as the input encoder. First, we propose to guide the pretrained LM’s attention mechanism to focus on relevant context by using attention probabilities as additional features for evidence prediction. Furthermore, instead of feeding the whole document into pretrained LMs to obtain entity representations, we concatenate the document text with head entities to help LMs concentrate on the parts of the document that are more related to the head entity. Our E2GRE jointly learns relation extraction and evidence prediction effectively, yielding large gains on both tasks, which we find to be highly correlated.
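As a rough illustration of the two ideas in the abstract (an entity-guided input, and attention probabilities reused as features for evidence prediction), the sketch below uses a Hugging Face BERT encoder. The example document, the head-to-document attention pooling, and the choice of layer are illustrative assumptions, not the paper's exact architecture.

```python
# Minimal sketch of the entity- and evidence-guided ideas described in the
# abstract, using Hugging Face Transformers. The pooling choices below are
# simplifications for illustration, not the authors' implementation.
import torch
from transformers import BertTokenizerFast, BertModel

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
encoder = BertModel.from_pretrained("bert-base-uncased", output_attentions=True)
encoder.eval()

# Hypothetical example document and head entity.
document = ("John Stanistreet was an Australian politician. "
            "He was born in Bendigo, Victoria. "
            "He was elected to the Australian House of Representatives.")
head_entity = "John Stanistreet"

# Entity-guided input: concatenate the head entity with the document so the
# encoder focuses on context relevant to that entity.
inputs = tokenizer(head_entity, document, return_tensors="pt", truncation=True)

with torch.no_grad():
    outputs = encoder(**inputs)

# Attention probabilities from the last layer: (batch, heads, seq_len, seq_len),
# averaged over heads.
attn = outputs.attentions[-1].mean(dim=1).squeeze(0)

# Segment 0 holds the head entity (plus [CLS]/[SEP], kept here for simplicity);
# segment 1 holds the document tokens.
token_type_ids = inputs["token_type_ids"].squeeze(0)
head_positions = (token_type_ids == 0).nonzero(as_tuple=True)[0]
doc_positions = (token_type_ids == 1).nonzero(as_tuple=True)[0]

# Attention from head-entity tokens to each document token, averaged over the
# head-entity tokens. In a full model these token-level scores would be pooled
# per sentence and fed, alongside the encoder states, to an evidence classifier.
head_to_doc_attention = attn[head_positions][:, doc_positions].mean(dim=0)
print(head_to_doc_attention.shape)  # one attention-based score per document token
```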
Anthology ID: 2021.repl4nlp-1.30
Volume: Proceedings of the 6th Workshop on Representation Learning for NLP (RepL4NLP-2021)
Month: August
Year: 2021
Address: Online
Editors: Anna Rogers, Iacer Calixto, Ivan Vulić, Naomi Saphra, Nora Kassner, Oana-Maria Camburu, Trapit Bansal, Vered Shwartz
Venue: RepL4NLP
Publisher: Association for Computational Linguistics
Pages: 307–315
URL: https://aclanthology.org/2021.repl4nlp-1.30
DOI: 10.18653/v1/2021.repl4nlp-1.30
Cite (ACL):
Kevin Huang, Peng Qi, Guangtao Wang, Tengyu Ma, and Jing Huang. 2021. Entity and Evidence Guided Document-Level Relation Extraction. In Proceedings of the 6th Workshop on Representation Learning for NLP (RepL4NLP-2021), pages 307–315, Online. Association for Computational Linguistics.
Cite (Informal):
Entity and Evidence Guided Document-Level Relation Extraction (Huang et al., RepL4NLP 2021)
PDF: https://aclanthology.org/2021.repl4nlp-1.30.pdf
Optional supplementary material: 2021.repl4nlp-1.30.OptionalSupplementaryMaterial.pdf
Data: DocRED