Attention Guided Graph Convolutional Networks for Relation Extraction

Zhijiang Guo, Yan Zhang, Wei Lu


Abstract
Dependency trees convey rich structural information that has proven useful for extracting relations among entities in text. However, how to effectively make use of relevant information while ignoring irrelevant information from the dependency trees remains a challenging research question. Existing approaches employing rule-based hard-pruning strategies for selecting relevant partial dependency structures may not always yield optimal results. In this work, we propose Attention Guided Graph Convolutional Networks (AGGCNs), a novel model which directly takes full dependency trees as inputs. Our model can be understood as a soft-pruning approach that automatically learns how to selectively attend to the relevant sub-structures useful for the relation extraction task. Extensive experiments on various tasks, including cross-sentence n-ary relation extraction and large-scale sentence-level relation extraction, show that our model is able to better leverage the structural information of full dependency trees, giving significantly better results than previous approaches.
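The soft-pruning idea in the abstract can be sketched as follows: instead of hard-pruning the dependency tree with rules, self-attention scores over the tree's nodes act as a weighted, fully connected adjacency matrix that a graph convolution then operates on. This is a minimal single-head NumPy sketch; the function names, matrix shapes, and random weights are illustrative assumptions, and the actual AGGCN model additionally uses multiple attention heads and densely connected GCN layers.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_guided_adjacency(H, Wq, Wk):
    # Self-attention over node representations H yields a soft,
    # fully connected adjacency matrix: a learned "soft pruning"
    # of the original dependency tree (hypothetical single head).
    Q, K = H @ Wq, H @ Wk
    return softmax(Q @ K.T / np.sqrt(K.shape[-1]))

def gcn_layer(A, H, W):
    # One graph convolution over the attention-induced graph,
    # with a ReLU nonlinearity.
    return np.maximum(A @ H @ W, 0.0)

rng = np.random.default_rng(0)
n, d = 5, 8  # e.g. 5 tokens, hidden size 8 (toy values)
H = rng.standard_normal((n, d))
A = attention_guided_adjacency(H, rng.standard_normal((d, d)),
                               rng.standard_normal((d, d)))
out = gcn_layer(A, H, rng.standard_normal((d, d)))
```

Each row of `A` is a probability distribution over all nodes, so every node can attend to every other node with learned weights rather than being cut off by a hard-pruning rule.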
Anthology ID:
P19-1024
Volume:
Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2019
Address:
Florence, Italy
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
241–251
URL:
https://aclanthology.org/P19-1024
DOI:
10.18653/v1/P19-1024
Cite (ACL):
Zhijiang Guo, Yan Zhang, and Wei Lu. 2019. Attention Guided Graph Convolutional Networks for Relation Extraction. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, pages 241–251, Florence, Italy. Association for Computational Linguistics.
Cite (Informal):
Attention Guided Graph Convolutional Networks for Relation Extraction (Guo et al., ACL 2019)
PDF:
https://aclanthology.org/P19-1024.pdf
Video:
https://vimeo.com/383992004
Code:
Cartus/AGGCN_TACRED + additional community code
Data:
TACRED