Document-Level Relation Extraction with Sentences Importance Estimation and Focusing

Wang Xu, Kehai Chen, Lili Mou, Tiejun Zhao


Abstract
Document-level relation extraction (DocRE) aims to determine the relation between two entities from a document of multiple sentences. Recent studies typically represent the entire document by sequence- or graph-based models to predict the relations of all entity pairs. However, we find that such a model is not robust and exhibits bizarre behaviors: it predicts correctly when an entire test document is fed as input, but errs when non-evidence sentences are removed. To address this, we propose a Sentence Importance Estimation and Focusing (SIEF) framework for DocRE, where we design a sentence importance score and a sentence focusing loss, encouraging DocRE models to focus on evidence sentences. Experimental results on two domains show that our SIEF not only improves overall performance, but also makes DocRE models more robust. Moreover, SIEF is a general framework, shown to be effective when combined with a variety of base DocRE models.
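The robustness observation above suggests a simple leave-one-out view of sentence importance: a sentence matters for an entity pair to the extent that removing it changes the model's prediction. The sketch below illustrates this idea with a toy scoring model; it is a hypothetical illustration of the general principle, not the paper's exact formulation, and `toy_predict` is an invented stand-in for a real DocRE model.

```python
# Hypothetical leave-one-out sentence importance sketch (illustrative only,
# not the paper's exact SIEF formulation): a sentence's importance is the
# drop in predicted relation probability when that sentence is removed.

from typing import Callable, List


def sentence_importance(
    predict: Callable[[List[str]], float],  # returns P(relation | document)
    sentences: List[str],
) -> List[float]:
    """Score each sentence by how much removing it changes the prediction."""
    full_score = predict(sentences)
    scores = []
    for i in range(len(sentences)):
        reduced = sentences[:i] + sentences[i + 1:]
        scores.append(full_score - predict(reduced))
    return scores


# Toy model: probability is the fraction of sentences mentioning both entities.
def toy_predict(doc: List[str]) -> float:
    if not doc:
        return 0.0
    hits = sum(1 for s in doc if "Alice" in s and "Bob" in s)
    return hits / len(doc)


doc = [
    "Alice met Bob in Paris.",          # evidence
    "The weather was sunny that day.",  # non-evidence
    "Alice and Bob signed the deal.",   # evidence
]
scores = sentence_importance(toy_predict, doc)
# Evidence sentences get positive scores; removing the non-evidence
# sentence raises the toy model's confidence, so it scores negative.
```

Under this view, a focusing loss would then push the model to rely on high-importance (evidence) sentences, which is consistent with the abstract's description of encouraging DocRE models to focus on evidence.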
Anthology ID:
2022.naacl-main.212
Volume:
Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
Month:
July
Year:
2022
Address:
Seattle, United States
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
2920–2929
URL:
https://aclanthology.org/2022.naacl-main.212
DOI:
10.18653/v1/2022.naacl-main.212
Cite (ACL):
Wang Xu, Kehai Chen, Lili Mou, and Tiejun Zhao. 2022. Document-Level Relation Extraction with Sentences Importance Estimation and Focusing. In Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pages 2920–2929, Seattle, United States. Association for Computational Linguistics.
Cite (Informal):
Document-Level Relation Extraction with Sentences Importance Estimation and Focusing (Xu et al., NAACL 2022)
PDF:
https://aclanthology.org/2022.naacl-main.212.pdf
Code
 xwjim/sief
Data
DialogRE, DocRED