TeMP: Temporal Message Passing for Temporal Knowledge Graph Completion

Jiapeng Wu, Meng Cao, Jackie Chi Kit Cheung, William L. Hamilton


Abstract
Inferring missing facts in temporal knowledge graphs (TKGs) is a fundamental and challenging task. Previous works have approached this problem by augmenting methods for static knowledge graphs to leverage time-dependent representations. However, these methods do not explicitly leverage multi-hop structural information and temporal facts from recent time steps to enhance their predictions. Additionally, prior work does not explicitly address the temporal sparsity and variability of entity distributions in TKGs. We propose the Temporal Message Passing (TeMP) framework to address these challenges by combining graph neural networks, temporal dynamics models, data imputation and frequency-based gating techniques. Experiments on standard TKG tasks show that our approach provides substantial gains compared to the previous state of the art, achieving a 10.7% average relative improvement in Hits@10 across three standard benchmarks. Our analysis also reveals important sources of variability both within and across TKG datasets, and we introduce several simple but strong baselines that outperform the prior state of the art in certain settings.
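To make the combination concrete, the sketch below illustrates the general idea of blending a temporally aggregated entity representation with a static one via a frequency-based gate. All names, the exponential-decay summary (standing in for the paper's sequence model, e.g. a GRU), and the exact gate form are illustrative assumptions, not the paper's actual equations.

```python
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Hypothetical per-time-step entity embeddings, as if produced by a GNN
# encoder run on each temporal snapshot (time_steps x dim).
DIM = 8
structural = [[random.gauss(0, 1) for _ in range(DIM)] for _ in range(5)]

def temporal_summary(embs, decay=0.8):
    """Exponentially weighted average over recent snapshots; a simple
    stand-in for a learned temporal dynamics model (e.g. a GRU)."""
    n = len(embs)
    weights = [decay ** (n - 1 - t) for t in range(n)]
    total = sum(weights)
    return [sum(w * e[d] for w, e in zip(weights, embs)) / total
            for d in range(DIM)]

def gated_representation(static_emb, temporal_emb, frequency, scale=1.0):
    """Frequency-based gating (hypothetical form): entities that appear
    often in recent snapshots lean on their temporal summary, while
    rarely seen entities fall back to a static embedding."""
    gate = sigmoid(scale * (frequency - 1.0))
    return [gate * t + (1.0 - gate) * s
            for s, t in zip(static_emb, temporal_emb)]

static_emb = [random.gauss(0, 1) for _ in range(DIM)]
temporal_emb = temporal_summary(structural)
rep = gated_representation(static_emb, temporal_emb, frequency=3.0)
print(len(rep))  # 8
```

The gate moves toward the temporal summary as an entity's recent frequency grows, which mirrors the paper's motivation: temporally sparse entities should not be forced to rely on unreliable recent representations.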
Anthology ID: 2020.emnlp-main.462
Volume: Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)
Month: November
Year: 2020
Address: Online
Venue: EMNLP
Publisher: Association for Computational Linguistics
Pages: 5730–5746
URL: https://aclanthology.org/2020.emnlp-main.462
DOI: 10.18653/v1/2020.emnlp-main.462
PDF: https://aclanthology.org/2020.emnlp-main.462.pdf
Video: https://slideslive.com/38939266
Code: JiapengWu/TeMP