One-Shot Relational Learning for Knowledge Graphs

Wenhan Xiong, Mo Yu, Shiyu Chang, Xiaoxiao Guo, William Yang Wang


Abstract
Knowledge graphs (KGs) are key components of various natural language processing applications. To further expand KGs' coverage, previous studies on knowledge graph completion usually require a large number of positive examples for each relation. However, we observe that long-tail relations are actually more common in KGs, and newly added relations often do not have many known triples for training. In this work, we aim to predict new facts under a challenging setting where only one training instance is available. We propose a one-shot relational learning framework, which utilizes the knowledge distilled by embedding models and learns a matching metric by considering both the learned embeddings and one-hop graph structures. Empirically, our model yields considerable performance improvements over existing embedding models, and also eliminates the need to re-train the embedding models when dealing with newly added relations.
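The core idea above can be illustrated with a minimal sketch: encode each entity together with its one-hop (relation, neighbor) context, represent a candidate fact as the pair of encoded entities, and rank candidates by similarity to the single reference triple of the new relation. All names, dimensions, and the toy graph below are hypothetical, and the mean-pooling neighbor encoder plus cosine similarity stands in for the learned encoder and matching network described in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 8  # toy embedding size; the paper uses pretrained KG embeddings

# Hypothetical stand-ins for embeddings produced by a KG embedding model.
entity_emb = {e: rng.normal(size=DIM) for e in ["a", "b", "c", "d", "n1", "n2"]}
relation_emb = {r: rng.normal(size=DIM) for r in ["r1", "r2"]}

# One-hop graph structure: entity -> list of (relation, neighbor entity).
neighbors = {
    "a": [("r1", "n1")], "b": [("r2", "n2")],
    "c": [("r1", "n1")], "d": [("r2", "n2")],
}

def encode_entity(e):
    """Concatenate an entity's own embedding with the mean of its one-hop
    (relation, neighbor) embeddings -- a simplified neighbor encoder."""
    hops = neighbors.get(e, [])
    if not hops:
        return np.concatenate([entity_emb[e], np.zeros(2 * DIM)])
    ctx = np.mean([np.concatenate([relation_emb[r], entity_emb[n]])
                   for r, n in hops], axis=0)
    return np.concatenate([entity_emb[e], ctx])

def match_score(ref_pair, cand_pair):
    """Cosine similarity between two (head, tail) pair representations."""
    u = np.concatenate([encode_entity(ref_pair[0]), encode_entity(ref_pair[1])])
    v = np.concatenate([encode_entity(cand_pair[0]), encode_entity(cand_pair[1])])
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# The one known triple (h, t) of a new relation serves as the reference;
# candidate tail entities are ranked by similarity to it.
reference = ("a", "b")
scores = {cand: match_score(reference, cand)
          for cand in [("c", "d"), ("d", "c")]}
```

In the full model the fixed mean-pooling and cosine similarity are replaced by learned components, but the overall matching structure — encode the one-shot reference pair, encode each candidate pair, compare — is the same.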
Anthology ID:
D18-1223
Volume:
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
Month:
October-November
Year:
2018
Address:
Brussels, Belgium
Venue:
EMNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
1980–1990
URL:
https://aclanthology.org/D18-1223
DOI:
10.18653/v1/D18-1223
Cite (ACL):
Wenhan Xiong, Mo Yu, Shiyu Chang, Xiaoxiao Guo, and William Yang Wang. 2018. One-Shot Relational Learning for Knowledge Graphs. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, pages 1980–1990, Brussels, Belgium. Association for Computational Linguistics.
Cite (Informal):
One-Shot Relational Learning for Knowledge Graphs (Xiong et al., EMNLP 2018)
PDF:
https://aclanthology.org/D18-1223.pdf
Attachment:
 D18-1223.Attachment.pdf
Code
 xwhan/One-shot-Relational-Learning
Data
Wiki-One