Pre-trained Language Model with Prompts for Temporal Knowledge Graph Completion

Wenjie Xu, Ben Liu, Miao Peng, Xu Jia, Min Peng


Abstract
Temporal knowledge graph completion (TKGC) is the task of reasoning over known timestamps to complete the missing parts of facts, and it has attracted increasing attention in recent years. Most existing methods focus on learning representations with graph neural networks, but they extract information from timestamps inaccurately and underuse the information implicit in relations. To address these problems, we propose a novel TKGC model, Pre-trained Language Model with Prompts for TKGC (PPT). We convert a series of sampled quadruples into inputs for a pre-trained language model and map the intervals between timestamps to different prompts, producing coherent sentences that carry implicit semantic information. We train the model with a masking strategy that turns the TKGC task into masked-token prediction, which lets us exploit the semantic knowledge in pre-trained language models. Experiments on three benchmark datasets and extensive analysis show that our model is highly competitive with other models on four metrics and can effectively incorporate information from temporal knowledge graphs into language models.
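The sketch below illustrates, in a minimal way, the prompt-construction idea described in the abstract: a quadruple (subject, relation, object, timestamp) is verbalized into a sentence with a [MASK] slot, and the gap to the previous sampled timestamp is mapped to a time-prompt phrase. The templates, interval buckets, and function names (interval_prompt, masked_input) here are hypothetical placeholders; the actual verbalization rules and prompts used by PPT are defined in the paper.

```python
from datetime import date

def interval_prompt(prev: date, curr: date) -> str:
    """Map the gap between two timestamps to a coarse natural-language prompt (illustrative buckets)."""
    days = (curr - prev).days
    if days == 0:
        return "on the same day,"
    if days <= 7:
        return "a few days later,"
    if days <= 31:
        return "weeks later,"
    return "a long time later,"

def masked_input(subject: str, relation: str, prev: date, curr: date) -> str:
    """Build a masked sentence so that object prediction becomes masked-token prediction."""
    return f"{interval_prompt(prev, curr)} {subject} {relation} [MASK]."

# Example: predict the missing object of (Barack Obama, made a visit to, ?, 2014-03-25),
# given an earlier sampled fact dated 2014-03-22.
print(masked_input("Barack Obama", "made a visit to", date(2014, 3, 22), date(2014, 3, 25)))
# -> "a few days later, Barack Obama made a visit to [MASK]."
```

A masked language model would then score candidate entities for the [MASK] position, which is how the completion task is cast as token prediction.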
Anthology ID:
2023.findings-acl.493
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
7790–7803
URL:
https://aclanthology.org/2023.findings-acl.493
DOI:
10.18653/v1/2023.findings-acl.493
Cite (ACL):
Wenjie Xu, Ben Liu, Miao Peng, Xu Jia, and Min Peng. 2023. Pre-trained Language Model with Prompts for Temporal Knowledge Graph Completion. In Findings of the Association for Computational Linguistics: ACL 2023, pages 7790–7803, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Pre-trained Language Model with Prompts for Temporal Knowledge Graph Completion (Xu et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-acl.493.pdf