Inductively Representing Out-of-Knowledge-Graph Entities by Optimal Estimation Under Translational Assumptions

Damai Dai, Hua Zheng, Fuli Luo, Pengcheng Yang, Tianyu Liu, Zhifang Sui, Baobao Chang


Abstract
Conventional Knowledge Graph Completion (KGC) assumes that all test entities appear during training. However, in real-world scenarios, Knowledge Graphs (KGs) evolve rapidly, with out-of-knowledge-graph (OOKG) entities added frequently, and we need to represent these entities efficiently. Most existing Knowledge Graph Embedding (KGE) methods cannot represent OOKG entities without costly retraining on the whole KG. To enhance efficiency, we propose a simple and effective method that inductively represents OOKG entities by their optimal estimation under translational assumptions. Moreover, given pretrained embeddings of the in-knowledge-graph (IKG) entities, our method requires no additional learning. Experimental results on two KGC tasks with OOKG entities show that our method outperforms previous methods by a large margin while being more efficient.
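To make the idea concrete, the following is a minimal sketch (not the authors' released code) of how an OOKG entity embedding might be estimated from pretrained IKG embeddings under the translational assumption h + r ≈ t used by TransE-style models. The simple mean aggregation, the "?" placeholder, and all names below are illustrative assumptions, not the paper's exact formulation.

```python
# Hedged sketch: estimate an OOKG entity embedding from its auxiliary triples,
# assuming pretrained entity/relation embeddings and the TransE assumption h + r ≈ t.
import numpy as np

def estimate_ookg_embedding(aux_triples, entity_emb, relation_emb):
    """aux_triples: list of (h, r, t) where the OOKG entity is marked by the
    placeholder '?' and the other entity is an in-KG entity with a known embedding."""
    estimates = []
    for h, r, t in aux_triples:
        if h == "?":
            # OOKG entity appears as head: h ≈ t - r
            estimates.append(entity_emb[t] - relation_emb[r])
        else:
            # OOKG entity appears as tail: t ≈ h + r
            estimates.append(entity_emb[h] + relation_emb[r])
    # Aggregate the per-triple estimates (a plain mean here, for illustration).
    return np.mean(estimates, axis=0)

# Toy usage with random "pretrained" embeddings (illustrative only).
dim = 4
entity_emb = {"France": np.random.randn(dim)}
relation_emb = {"capital_of": np.random.randn(dim), "located_in": np.random.randn(dim)}
aux = [("?", "capital_of", "France"), ("?", "located_in", "France")]
print(estimate_ookg_embedding(aux, entity_emb, relation_emb))
```

Because each estimate is read directly off the pretrained embeddings, no gradient updates or retraining are needed, which is the efficiency advantage the abstract points to.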
Anthology ID:
2021.repl4nlp-1.10
Volume:
Proceedings of the 6th Workshop on Representation Learning for NLP (RepL4NLP-2021)
Month:
August
Year:
2021
Address:
Online
Editors:
Anna Rogers, Iacer Calixto, Ivan Vulić, Naomi Saphra, Nora Kassner, Oana-Maria Camburu, Trapit Bansal, Vered Shwartz
Venue:
RepL4NLP
Publisher:
Association for Computational Linguistics
Pages:
83–89
URL:
https://aclanthology.org/2021.repl4nlp-1.10
DOI:
10.18653/v1/2021.repl4nlp-1.10
Cite (ACL):
Damai Dai, Hua Zheng, Fuli Luo, Pengcheng Yang, Tianyu Liu, Zhifang Sui, and Baobao Chang. 2021. Inductively Representing Out-of-Knowledge-Graph Entities by Optimal Estimation Under Translational Assumptions. In Proceedings of the 6th Workshop on Representation Learning for NLP (RepL4NLP-2021), pages 83–89, Online. Association for Computational Linguistics.
Cite (Informal):
Inductively Representing Out-of-Knowledge-Graph Entities by Optimal Estimation Under Translational Assumptions (Dai et al., RepL4NLP 2021)
PDF:
https://aclanthology.org/2021.repl4nlp-1.10.pdf
Video:
https://aclanthology.org/2021.repl4nlp-1.10.mp4