Temporal Knowledge Graph Completion using a Linear Temporal Regularizer and Multivector Embeddings

Chengjin Xu, Yung-Yu Chen, Mojtaba Nayyeri, Jens Lehmann


Abstract
Representation learning approaches for knowledge graphs have been mostly designed for static data. However, many knowledge graphs involve evolving data, e.g., the fact (The President of the United States is Barack Obama) is valid only from 2009 to 2017. This introduces important challenges for knowledge representation learning since the knowledge graphs change over time. In this paper, we present a novel time-aware knowledge graph embedding approach, TeLM, which performs 4th-order tensor factorization of a Temporal knowledge graph using a Linear temporal regularizer and Multivector embeddings. Moreover, we investigate the effect of the temporal dataset’s time granularity on temporal knowledge graph completion. Experimental results demonstrate that our proposed models trained with the linear temporal regularizer achieve state-of-the-art performance on link prediction over four well-established temporal knowledge graph completion benchmarks.
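The linear temporal regularizer mentioned in the abstract encourages timestamp embeddings to evolve smoothly along a shared linear drift. A minimal sketch of that idea is shown below; the function name, the choice of p-norm, and the shared `bias` drift vector are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def linear_temporal_regularizer(time_embs: np.ndarray,
                                bias: np.ndarray,
                                p: int = 3) -> float:
    """Sketch of a linear temporal smoothing penalty.

    Penalizes adjacent timestamp embeddings for deviating from a shared
    linear drift: sum_t || T[t+1] - T[t] - bias ||_p^p.

    time_embs: (num_timestamps, dim) matrix of timestamp embeddings.
    bias:      (dim,) learned drift vector shared across all time steps.
    (Hypothetical signature; the paper's exact norm and weighting may differ.)
    """
    # Differences between consecutive timestamp embeddings, minus the drift.
    diffs = time_embs[1:] - time_embs[:-1] - bias
    # Element-wise p-norm penalty, summed over all steps and dimensions.
    return float(np.sum(np.abs(diffs) ** p))
```

If consecutive timestamp embeddings differ by exactly the drift vector, the penalty is zero, so the regularizer pushes the time axis of the embedding space toward a linear trajectory rather than forcing all timestamps to be identical.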
Anthology ID:
2021.naacl-main.202
Volume:
Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
Month:
June
Year:
2021
Address:
Online
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
2569–2578
URL:
https://aclanthology.org/2021.naacl-main.202
DOI:
10.18653/v1/2021.naacl-main.202
PDF:
https://aclanthology.org/2021.naacl-main.202.pdf