Modular Self-Supervision for Document-Level Relation Extraction

Sheng Zhang, Cliff Wong, Naoto Usuyama, Sarthak Jain, Tristan Naumann, Hoifung Poon
Abstract
Extracting relations across large text spans has been relatively underexplored in NLP, but it is particularly important for high-value domains such as biomedicine, where obtaining high recall of the latest findings is crucial for practical applications. Compared to conventional information extraction confined to short text spans, document-level relation extraction faces additional challenges in both inference and learning. Given longer text spans, state-of-the-art neural architectures are less effective and task-specific self-supervision such as distant supervision becomes very noisy. In this paper, we propose decomposing document-level relation extraction into relation detection and argument resolution, taking inspiration from Davidsonian semantics. This enables us to incorporate explicit discourse modeling and leverage modular self-supervision for each sub-problem, which is less noise-prone and can be further refined end-to-end via variational EM. We conduct a thorough evaluation in biomedical machine reading for precision oncology, where cross-paragraph relation mentions are prevalent. Our method outperforms prior state of the art, such as multi-scale learning and graph neural networks, by over 20 absolute F1 points. The gain is particularly pronounced among the most challenging relation instances whose arguments never co-occur in a paragraph.
Anthology ID:
2021.emnlp-main.429
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
5291–5302
URL:
https://aclanthology.org/2021.emnlp-main.429
DOI:
10.18653/v1/2021.emnlp-main.429
Cite (ACL):
Sheng Zhang, Cliff Wong, Naoto Usuyama, Sarthak Jain, Tristan Naumann, and Hoifung Poon. 2021. Modular Self-Supervision for Document-Level Relation Extraction. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 5291–5302, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Modular Self-Supervision for Document-Level Relation Extraction (Zhang et al., EMNLP 2021)
PDF:
https://aclanthology.org/2021.emnlp-main.429.pdf
Video:
https://aclanthology.org/2021.emnlp-main.429.mp4
Data
DocRED