Relation-Specific Attentions over Entity Mentions for Enhanced Document-Level Relation Extraction

Jiaxin Yu, Deqing Yang, Shuyu Tian


Abstract
Compared with traditional sentence-level relation extraction, document-level relation extraction is a more challenging task in which an entity may be mentioned multiple times in a document and associated with multiple relations. However, most document-level relation extraction methods do not distinguish mention-level features from entity-level features, and simply apply a pooling operation to aggregate mention-level features into entity-level features. As a result, the distinct semantics of an entity's different mentions are overlooked. To address this problem, we propose RSMAN, which performs selective attentions over an entity's different mentions with respect to candidate relations. In this manner, flexible and relation-specific representations of entities are obtained, which indeed benefit relation classification. Extensive experiments on two benchmark datasets show that RSMAN brings significant improvements to some backbone models, achieving state-of-the-art performance, especially when an entity has multiple mentions in the document.
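The core idea described above — attending over an entity's mentions separately for each candidate relation, instead of pooling them once — can be sketched as follows. This is a minimal illustration, not the paper's actual implementation: the function name, the use of dot-product scoring, and the random toy embeddings are all assumptions made for the example.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def relation_specific_entity_reps(mention_embs, relation_embs):
    """For each candidate relation, attend over an entity's mention
    embeddings and return one relation-specific entity representation.

    mention_embs:  (num_mentions, dim) embeddings of the entity's mentions
    relation_embs: (num_relations, dim) learned relation prototype vectors
    returns:       (num_relations, dim) one entity vector per relation
    """
    # Attention scores: each relation queries every mention (dot product;
    # the scoring function here is a simplifying assumption).
    scores = relation_embs @ mention_embs.T        # (R, M)
    weights = softmax(scores, axis=-1)             # normalize over mentions
    # Weighted sum of mention embeddings yields a flexible,
    # relation-specific entity representation instead of a single
    # pooled vector shared by all relations.
    return weights @ mention_embs                  # (R, dim)

rng = np.random.default_rng(0)
mentions = rng.normal(size=(3, 8))   # toy entity with 3 mentions
relations = rng.normal(size=(5, 8))  # 5 candidate relations
reps = relation_specific_entity_reps(mentions, relations)
print(reps.shape)  # (5, 8)
```

Because the attention weights differ per relation, the same entity contributes different representations to different candidate relation classifiers, which is the property the abstract argues simple pooling loses.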
Anthology ID:
2022.naacl-main.109
Volume:
Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
Month:
July
Year:
2022
Address:
Seattle, United States
Editors:
Marine Carpuat, Marie-Catherine de Marneffe, Ivan Vladimir Meza Ruiz
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
1523–1529
URL:
https://aclanthology.org/2022.naacl-main.109
DOI:
10.18653/v1/2022.naacl-main.109
Bibkey:
Cite (ACL):
Jiaxin Yu, Deqing Yang, and Shuyu Tian. 2022. Relation-Specific Attentions over Entity Mentions for Enhanced Document-Level Relation Extraction. In Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pages 1523–1529, Seattle, United States. Association for Computational Linguistics.
Cite (Informal):
Relation-Specific Attentions over Entity Mentions for Enhanced Document-Level Relation Extraction (Yu et al., NAACL 2022)
PDF:
https://aclanthology.org/2022.naacl-main.109.pdf
Video:
https://aclanthology.org/2022.naacl-main.109.mp4
Code
fduyjx/rsman
Data
DWIE
DocRED