A2N: Attending to Neighbors for Knowledge Graph Inference

Trapit Bansal, Da-Cheng Juan, Sujith Ravi, Andrew McCallum


Abstract
State-of-the-art models for knowledge graph completion aim to learn a fixed embedding for each entity in a multi-relational graph, which can then be used to infer unseen entity relationships at test time. This can be sub-optimal, since a single fixed representation must both memorize and generalize over all possible relationships an entity participates in. We therefore propose a novel attention-based method that learns query-dependent entity representations by adaptively combining the relevant graph neighborhood of an entity, leading to more accurate knowledge graph completion. The proposed method is evaluated on two benchmark datasets for knowledge graph completion; experimental results show that it performs competitively with or better than existing state-of-the-art methods, including recent methods for explicit multi-hop reasoning. Qualitative probing offers insight into how the model reasons about facts involving multiple hops in the knowledge graph through neighborhood attention.
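To make the abstract's core idea concrete, here is a minimal sketch of query-dependent neighborhood attention in the spirit of A2N: a source entity is represented by attending over its (relation, entity) neighbors, with attention weights conditioned on the query relation. All names, dimensions, and design choices below (concatenated neighbor embeddings, dot-product attention) are illustrative assumptions, not the paper's exact formulation.

```python
# Hypothetical sketch of query-dependent neighborhood attention.
# Embedding tables, dimensions, and scoring choices are assumptions.
import numpy as np

def softmax(x):
    z = x - x.max()
    e = np.exp(z)
    return e / e.sum()

def attend_to_neighbors(query_rel, neighbors, rel_emb, ent_emb):
    """Build a query-dependent representation of a source entity by
    attending over its (relation, entity) neighborhood."""
    # Represent each neighbor by concatenating its relation and entity
    # embeddings (an assumed design choice for this sketch).
    nbr_reps = np.stack([
        np.concatenate([rel_emb[r], ent_emb[e]]) for r, e in neighbors
    ])
    # Query vector built from the query relation, matched to neighbor dims.
    q = np.concatenate([rel_emb[query_rel], rel_emb[query_rel]])
    # Attention weights: similarity of each neighbor to the query relation,
    # so different queries emphasize different parts of the neighborhood.
    weights = softmax(nbr_reps @ q)
    # Query-dependent entity representation: weighted sum of neighbors.
    return weights @ nbr_reps

rng = np.random.default_rng(0)
rel_emb = rng.normal(size=(5, 8))     # 5 relations, embedding dim 8
ent_emb = rng.normal(size=(10, 8))    # 10 entities, embedding dim 8
neighbors = [(0, 3), (2, 7), (4, 1)]  # (relation, entity) edges of a node
rep = attend_to_neighbors(1, neighbors, rel_emb, ent_emb)
print(rep.shape)  # (16,)
```

In a full model, this representation would then be scored against candidate target entities (e.g. with a bilinear scoring function), so the same entity can present different facets of its neighborhood to different queries.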
Anthology ID:
P19-1431
Volume:
Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2019
Address:
Florence, Italy
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
4387–4392
URL:
https://aclanthology.org/P19-1431
DOI:
10.18653/v1/P19-1431
Cite (ACL):
Trapit Bansal, Da-Cheng Juan, Sujith Ravi, and Andrew McCallum. 2019. A2N: Attending to Neighbors for Knowledge Graph Inference. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, pages 4387–4392, Florence, Italy. Association for Computational Linguistics.
Cite (Informal):
A2N: Attending to Neighbors for Knowledge Graph Inference (Bansal et al., ACL 2019)
PDF:
https://aclanthology.org/P19-1431.pdf
Video:
https://vimeo.com/385264668