Dependency Position Encoding for Relation Extraction

Qiushi Guo, Xin Wang, Dehong Gao


Abstract
Leveraging the dependency tree of the input sentence can improve model performance for relation extraction. A challenging issue is how to remove confusing, task-irrelevant information from the tree. Efforts have been made to use the dependency connections between words to selectively emphasize target-relevant information. However, these approaches fall short in exploiting dependency types. In this paper, we propose dependency position encoding (DPE), an efficient way of incorporating both dependency connections and dependency types into the self-attention mechanism to distinguish the importance of different word dependencies for the task. In contrast to previous studies that process the input sentence and dependency information in separate streams, DPE can be seamlessly incorporated into the Transformer, making it possible to use a one-stream scheme to extract relations between entity pairs. Extensive experiments show that models with our DPE significantly outperform previous methods on SemEval 2010 Task 8, KBP37, and TACRED.
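To make the idea concrete, the sketch below shows one way dependency information could be folded into self-attention as a learned bias on the attention logits, indexed by pairwise dependency-tree distance and by the dependency type of the connecting arc. This is a minimal illustration under our own assumptions, not the authors' exact formulation of DPE; all module, parameter, and input names (e.g. `tree_dist`, `dep_type`) are hypothetical.

```python
# Illustrative sketch (not the paper's exact method): self-attention with a learned
# bias per dependency-tree distance bucket and per dependency type.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DependencyBiasedSelfAttention(nn.Module):
    def __init__(self, d_model=256, n_heads=4, max_tree_dist=8, n_dep_types=40):
        super().__init__()
        self.n_heads = n_heads
        self.d_head = d_model // n_heads
        self.qkv = nn.Linear(d_model, 3 * d_model)
        self.out = nn.Linear(d_model, d_model)
        # One learned scalar bias per (bucketed tree distance, head).
        self.dist_bias = nn.Embedding(max_tree_dist + 1, n_heads)
        # One learned scalar bias per (dependency type, head).
        self.type_bias = nn.Embedding(n_dep_types, n_heads)
        self.max_tree_dist = max_tree_dist

    def forward(self, x, tree_dist, dep_type):
        # x:         (batch, seq, d_model) token representations
        # tree_dist: (batch, seq, seq) pairwise dependency-tree distances (long)
        # dep_type:  (batch, seq, seq) id of the dependency type on the arc (long)
        b, n, _ = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        q = q.view(b, n, self.n_heads, self.d_head).transpose(1, 2)
        k = k.view(b, n, self.n_heads, self.d_head).transpose(1, 2)
        v = v.view(b, n, self.n_heads, self.d_head).transpose(1, 2)

        scores = q @ k.transpose(-2, -1) / self.d_head ** 0.5  # (b, h, n, n)
        dist = tree_dist.clamp(max=self.max_tree_dist)
        # Embedding lookups give (b, n, n, h); move heads to dim 1 and add to logits.
        bias = (self.dist_bias(dist) + self.type_bias(dep_type)).permute(0, 3, 1, 2)
        attn = F.softmax(scores + bias, dim=-1)
        out = (attn @ v).transpose(1, 2).reshape(b, n, -1)
        return self.out(out)

# Tiny usage example with random inputs.
if __name__ == "__main__":
    layer = DependencyBiasedSelfAttention()
    x = torch.randn(2, 10, 256)
    tree_dist = torch.randint(0, 9, (2, 10, 10))
    dep_type = torch.randint(0, 40, (2, 10, 10))
    print(layer(x, tree_dist, dep_type).shape)  # torch.Size([2, 10, 256])
```

Because the dependency signal enters only as an additive bias inside attention, the sentence and its tree are handled in a single stream, which is the property the abstract contrasts with earlier two-stream designs.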
Anthology ID:
2022.findings-naacl.120
Volume:
Findings of the Association for Computational Linguistics: NAACL 2022
Month:
July
Year:
2022
Address:
Seattle, United States
Editors:
Marine Carpuat, Marie-Catherine de Marneffe, Ivan Vladimir Meza Ruiz
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
1601–1606
URL:
https://aclanthology.org/2022.findings-naacl.120
DOI:
10.18653/v1/2022.findings-naacl.120
Cite (ACL):
Qiushi Guo, Xin Wang, and Dehong Gao. 2022. Dependency Position Encoding for Relation Extraction. In Findings of the Association for Computational Linguistics: NAACL 2022, pages 1601–1606, Seattle, United States. Association for Computational Linguistics.
Cite (Informal):
Dependency Position Encoding for Relation Extraction (Guo et al., Findings 2022)
PDF:
https://aclanthology.org/2022.findings-naacl.120.pdf
Video:
https://aclanthology.org/2022.findings-naacl.120.mp4
Data
TACRED