Knowledge Graph Embeddings in Geometric Algebras

Chengjin Xu, Mojtaba Nayyeri, Yung-Yu Chen, Jens Lehmann


Abstract
Knowledge graph (KG) embedding aims to embed the entities and relations of a KG into a low-dimensional latent representation space. Existing KG embedding approaches model entities and relations using real-valued, complex-valued, or hypercomplex-valued (quaternion or octonion) representations, all of which are subsumed by a geometric algebra. In this work, we introduce GeomE, a novel geometric-algebra-based KG embedding framework that uses multivector representations and the geometric product to model entities and relations. Our framework subsumes several state-of-the-art KG embedding approaches and offers key advantages: it can model various important relation patterns, including (anti-)symmetry, inversion, and composition; it is richly expressive, with a higher degree of freedom; and it generalizes well. Experimental results on multiple benchmark knowledge graphs show that the proposed approach outperforms existing state-of-the-art models for link prediction.
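The abstract's core operation, combining multivector embeddings via the geometric product, can be illustrated in a toy 2D geometric algebra Cl(2,0), where a multivector has four components (scalar, e1, e2, e12). The sketch below is an illustrative assumption, not the paper's implementation: the `geo_product` table is standard Cl(2,0) algebra, but the `score` function (scalar part of h ∘ r ∘ conj(t), with a simple conjugation that negates the non-scalar parts) is a simplified stand-in for GeomE's actual scoring function.

```python
import numpy as np

def geo_product(a, b):
    """Geometric product in Cl(2,0).

    Multivectors are arrays (scalar, e1, e2, e12), using the standard
    multiplication table: e1*e1 = e2*e2 = 1, e1*e2 = e12, e12*e12 = -1.
    """
    a0, a1, a2, a3 = a
    b0, b1, b2, b3 = b
    return np.array([
        a0*b0 + a1*b1 + a2*b2 - a3*b3,   # scalar part
        a0*b1 + a1*b0 - a2*b3 + a3*b2,   # e1 part
        a0*b2 + a2*b0 + a1*b3 - a3*b1,   # e2 part
        a0*b3 + a3*b0 + a1*b2 - a2*b1,   # e12 part
    ])

def score(h, r, t):
    """Hypothetical GeomE-style triple score: scalar part of h ∘ r ∘ conj(t).

    The conjugation used here (negate all non-scalar components) is an
    assumption for illustration only.
    """
    conj_t = t * np.array([1.0, -1.0, -1.0, -1.0])
    return geo_product(geo_product(h, r), conj_t)[0]
```

In practice each entity and relation would carry many such multivectors (one per embedding dimension), with the scores summed across dimensions and the components trained by gradient descent against observed triples.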
Anthology ID: 2020.coling-main.46
Volume: Proceedings of the 28th International Conference on Computational Linguistics
Month: December
Year: 2020
Address: Barcelona, Spain (Online)
Venue: COLING
Publisher: International Committee on Computational Linguistics
Pages: 530–544
URL: https://aclanthology.org/2020.coling-main.46
DOI: 10.18653/v1/2020.coling-main.46
PDF: https://aclanthology.org/2020.coling-main.46.pdf