Rethinking Document-Level Relation Extraction: A Reality Check

Jing Li, Yequan Wang, Shuai Zhang, Min Zhang


Abstract
Recently, numerous efforts have continued to push the performance boundaries of document-level relation extraction (DocRE) and have claimed significant progress. In this paper, we do not aim to propose a novel DocRE model. Instead, we take a closer look at the field to examine whether these performance gains are genuine. Through a comprehensive literature review and a thorough examination of popular DocRE datasets, we find that these gains rest on a strong, and arguably untenable, shared assumption: all named entities are perfectly localized, normalized, and typed in advance. Next, we construct four types of entity mention attacks to examine the robustness of typical DocRE models via behavioral probing. We also closely examine model usability in a more realistic setting. Our findings reveal that most current DocRE models are vulnerable to entity mention attacks and difficult to deploy in real-world end-user NLP applications. Our study calls for future research to stop simplifying problem setups and to model DocRE in the wild rather than in an unrealistic utopian world.
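The abstract does not spell out the four attack types, so the sketch below is only an illustrative assumption of what one such entity mention perturbation could look like: substituting gold mention spans with an unseen alias before the document is fed to a DocRE model, then checking whether predictions change. The function name `corrupt_mentions`, the alias-substitution scheme, and the `[UNK_ENT]` token are all hypothetical, not the paper's actual attacks.

```python
import random

def corrupt_mentions(tokens, mentions, alias_pool, rate=0.5, seed=0):
    """Replace a fraction of gold entity mention spans with random aliases.

    tokens:     list[str], the document tokens.
    mentions:   list of (start, end) spans over `tokens` (end exclusive).
    alias_pool: list[str] of surface forms to substitute in.
    rate:       probability that any given mention is perturbed.
    """
    rng = random.Random(seed)
    tokens = list(tokens)
    # Process spans right-to-left so earlier span indices stay valid
    # after a replacement changes the length of the token list.
    for start, end in sorted(mentions, reverse=True):
        if rng.random() < rate:
            tokens[start:end] = [rng.choice(alias_pool)]
    return tokens

if __name__ == "__main__":
    doc = "Bill Gates founded Microsoft in Albuquerque .".split()
    spans = [(0, 2), (3, 4), (5, 6)]  # gold mention spans over `doc`
    print(corrupt_mentions(doc, spans, ["[UNK_ENT]"], rate=1.0))
    # -> ['[UNK_ENT]', 'founded', '[UNK_ENT]', 'in', '[UNK_ENT]', '.']
```

A behavioral probe in this spirit would compare a model's relation predictions on the original and perturbed documents; a robust model should not depend on perfectly localized, normalized mention strings.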
Anthology ID:
2023.findings-acl.353
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
5715–5730
URL:
https://aclanthology.org/2023.findings-acl.353
DOI:
10.18653/v1/2023.findings-acl.353
Cite (ACL):
Jing Li, Yequan Wang, Shuai Zhang, and Min Zhang. 2023. Rethinking Document-Level Relation Extraction: A Reality Check. In Findings of the Association for Computational Linguistics: ACL 2023, pages 5715–5730, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Rethinking Document-Level Relation Extraction: A Reality Check (Li et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-acl.353.pdf