Re-Temp: Relation-Aware Temporal Representation Learning for Temporal Knowledge Graph Completion

Kunze Wang, Caren Han, Josiah Poon


Abstract
Temporal Knowledge Graph Completion (TKGC) under the extrapolation setting aims to predict the missing entity in a future fact, a challenge that aligns closely with real-world prediction problems. Existing research mostly encodes entities and relations with sequential graph neural networks applied to recent snapshots. However, these approaches tend to overlook the ability to skip snapshots that are irrelevant to the entity-related relations in the query, and they disregard the importance of explicit temporal information. To address this, we propose Re-Temp (Relation-Aware Temporal Representation Learning), which takes explicit temporal embeddings as input and applies a skip information flow after each timestamp to discard information that is unnecessary for prediction. Additionally, we introduce a two-phase forward propagation method to prevent information leakage. Through evaluation on six TKGC (extrapolation) datasets, we demonstrate that our model outperforms all eight recent state-of-the-art models by a significant margin.
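The "skip information flow" described in the abstract can be illustrated with a minimal sketch. This is a hypothetical reconstruction, not the paper's actual implementation: it assumes the mechanism is a gate, conditioned on the query relation's embedding, that decides after each timestamp how much of the current snapshot's entity representation to keep versus how much of the previous state to carry forward (effectively skipping irrelevant snapshots). The class and variable names are invented for illustration.

```python
import torch
import torch.nn as nn


class RelationAwareSkipGate(nn.Module):
    """Hypothetical sketch of a relation-aware skip information flow.

    After each timestamp, a sigmoid gate conditioned on the query relation
    blends the newly encoded snapshot representation with the previous
    hidden state; gate values near 0 effectively skip the snapshot.
    """

    def __init__(self, dim: int):
        super().__init__()
        self.gate = nn.Linear(2 * dim, dim)

    def forward(self, prev_state, snapshot_emb, query_rel_emb):
        # Gate in (0, 1), computed from the snapshot and the query relation.
        g = torch.sigmoid(
            self.gate(torch.cat([snapshot_emb, query_rel_emb], dim=-1))
        )
        # Convex combination: keep snapshot info (g) or carry state (1 - g).
        return g * snapshot_emb + (1 - g) * prev_state


dim = 8
layer = RelationAwareSkipGate(dim)
prev = torch.randn(4, dim)   # entity states after the previous timestamp
snap = torch.randn(4, dim)   # states encoded from the current snapshot
rel = torch.randn(4, dim)    # embedding of the query relation
out = layer(prev, snap, rel)
print(out.shape)
```

Because the output is an elementwise convex combination, each output coordinate lies between the previous state and the snapshot representation; a query relation for which a snapshot is irrelevant can drive the gate toward zero and pass the old state through unchanged.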
Anthology ID:
2023.findings-emnlp.20
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2023
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
258–269
URL:
https://aclanthology.org/2023.findings-emnlp.20
DOI:
10.18653/v1/2023.findings-emnlp.20
Cite (ACL):
Kunze Wang, Caren Han, and Josiah Poon. 2023. Re-Temp: Relation-Aware Temporal Representation Learning for Temporal Knowledge Graph Completion. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 258–269, Singapore. Association for Computational Linguistics.
Cite (Informal):
Re-Temp: Relation-Aware Temporal Representation Learning for Temporal Knowledge Graph Completion (Wang et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-emnlp.20.pdf