RulE: Knowledge Graph Reasoning with Rule Embedding

Xiaojuan Tang, Song-Chun Zhu, Yitao Liang, Muhan Zhang


Abstract
Knowledge graph (KG) reasoning is an important problem for knowledge graphs. In this paper, we propose a novel and principled framework called RulE (which stands for Rule Embedding) to effectively leverage logical rules to enhance KG reasoning. Unlike knowledge graph embedding (KGE) methods, RulE learns rule embeddings from existing triplets and first-order rules by jointly representing entities, relations and logical rules in a unified embedding space. Based on the learned rule embeddings, a confidence score can be calculated for each rule, reflecting its consistency with the observed triplets. This allows us to perform logical rule inference in a soft way, thus alleviating the brittleness of logic. On the other hand, RulE injects prior logical rule information into the embedding space, enriching and regularizing the entity/relation embeddings. This makes KGE alone perform better too. RulE is conceptually simple and empirically effective. We conduct extensive experiments to verify each component of RulE. Results on multiple benchmarks reveal that our model outperforms the majority of existing embedding-based and rule-based approaches.
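To make the core idea concrete, below is a minimal sketch of what "jointly embedding entities, relations and rules in one space, then scoring each rule softly" can look like. It assumes a TransE-style triplet score and a simple additive rule-composition score; the class, method names and scoring functions here are illustrative assumptions, not the authors' actual RulE model.

# Illustrative sketch only (assumed scoring functions, not the paper's implementation):
# entities, relations and logical rules share one embedding space; a rule's
# confidence is computed from its own embedding and the relations it is composed of.
import torch
import torch.nn as nn

class JointRuleKGE(nn.Module):
    def __init__(self, n_entities, n_relations, n_rules, dim=200):
        super().__init__()
        self.ent = nn.Embedding(n_entities, dim)   # entity embeddings
        self.rel = nn.Embedding(n_relations, dim)  # relation embeddings
        self.rule = nn.Embedding(n_rules, dim)     # rule embeddings (same space)

    def triplet_score(self, h, r, t):
        # TransE-style plausibility of a triplet (h, r, t); higher is better.
        return -torch.norm(self.ent(h) + self.rel(r) - self.ent(t), p=1, dim=-1)

    def rule_confidence(self, rule_idx, body_rels, head_rel):
        # Soft consistency of a rule "body_rels => head_rel" with the embedding
        # space: compose the body relations, let the rule embedding act as a
        # correction term, and compare against the head relation.
        composed = self.rel(body_rels).sum(dim=-2)
        pred = composed + self.rule(rule_idx)
        return torch.sigmoid(-torch.norm(pred - self.rel(head_rel), p=1, dim=-1))

# Usage: score one rule, e.g. r1(x,y) AND r2(y,z) => r3(x,z)
model = JointRuleKGE(n_entities=1000, n_relations=20, n_rules=5)
conf = model.rule_confidence(torch.tensor(0),
                             torch.tensor([1, 2]),   # body relation ids
                             torch.tensor(3))        # head relation id
print(conf.item())  # soft confidence in (0, 1)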
Anthology ID:
2024.findings-acl.256
Volume:
Findings of the Association for Computational Linguistics ACL 2024
Month:
August
Year:
2024
Address:
Bangkok, Thailand and virtual meeting
Editors:
Lun-Wei Ku, Andre Martins, Vivek Srikumar
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
4316–4335
URL:
https://aclanthology.org/2024.findings-acl.256
Cite (ACL):
Xiaojuan Tang, Song-Chun Zhu, Yitao Liang, and Muhan Zhang. 2024. RulE: Knowledge Graph Reasoning with Rule Embedding. In Findings of the Association for Computational Linguistics ACL 2024, pages 4316–4335, Bangkok, Thailand and virtual meeting. Association for Computational Linguistics.
Cite (Informal):
RulE: Knowledge Graph Reasoning with Rule Embedding (Tang et al., Findings 2024)
PDF:
https://aclanthology.org/2024.findings-acl.256.pdf