KICGPT: Large Language Model with Knowledge in Context for Knowledge Graph Completion

Yanbin Wei, Qiushi Huang, Yu Zhang, James Kwok


Abstract
Knowledge Graph Completion (KGC) is crucial for addressing knowledge graph incompleteness and supporting downstream applications. Many models have been proposed for KGC, and they fall into two main classes: triple-based and text-based approaches. Triple-based methods struggle with long-tail entities due to limited structural information and imbalanced entity distributions. Text-based methods alleviate this issue but require costly training of language models and specific finetuning for knowledge graphs, which limits their efficiency. To address the limitations of both approaches, in this paper we propose KICGPT, a framework that integrates a large language model (LLM) with a triple-based KGC retriever to alleviate the long-tail problem without incurring additional training overhead. KICGPT uses an in-context learning strategy called Knowledge Prompt, which encodes structural knowledge into demonstrations to guide the LLM. Empirical results on benchmark datasets demonstrate the effectiveness of KICGPT with lighter training overhead and no finetuning.
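As a rough illustration of the Knowledge Prompt idea described in the abstract, the sketch below shows one plausible way retriever-ranked triples could be serialized into in-context demonstrations that guide an LLM to re-rank candidate entities. This is a minimal sketch under our own assumptions, not the paper's actual prompt format; all function names, the triple verbalization, and the candidate-ranking instruction are illustrative.

```python
# Hedged sketch (not the paper's exact implementation): serialize knowledge
# graph triples retrieved by a triple-based KGC model into an in-context
# "knowledge prompt" for an LLM. All names below are illustrative assumptions.

def verbalize_triple(head: str, relation: str, tail: str) -> str:
    """Render a KG triple as a demonstration line for the prompt."""
    return f"({head}, {relation}, {tail})"

def build_knowledge_prompt(query_head: str,
                           query_relation: str,
                           demo_triples: list[tuple[str, str, str]],
                           candidates: list[str]) -> str:
    """Assemble demonstrations plus the incomplete query triple."""
    demos = "\n".join(verbalize_triple(*t) for t in demo_triples)
    cand_list = ", ".join(candidates)
    return (
        "Complete the knowledge graph triple using the examples.\n"
        f"Examples:\n{demos}\n"
        f"Query: ({query_head}, {query_relation}, ?)\n"
        f"Rank these candidates from most to least plausible: {cand_list}"
    )

# Example usage with toy data; the demonstrations would come from the
# triple-based retriever's top-ranked neighbors in practice.
prompt = build_knowledge_prompt(
    query_head="Leonardo da Vinci",
    query_relation="notable_work",
    demo_triples=[
        ("Michelangelo", "notable_work", "David"),
        ("Raphael", "notable_work", "The School of Athens"),
    ],
    candidates=["Mona Lisa", "Starry Night", "Guernica"],
)
print(prompt)
```

Because the structural knowledge enters only through the prompt, no LLM finetuning is required, which is consistent with the abstract's claim of avoiding additional training overhead.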
Anthology ID:
2023.findings-emnlp.580
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2023
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
8667–8683
URL:
https://aclanthology.org/2023.findings-emnlp.580
DOI:
10.18653/v1/2023.findings-emnlp.580
Cite (ACL):
Yanbin Wei, Qiushi Huang, Yu Zhang, and James Kwok. 2023. KICGPT: Large Language Model with Knowledge in Context for Knowledge Graph Completion. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 8667–8683, Singapore. Association for Computational Linguistics.
Cite (Informal):
KICGPT: Large Language Model with Knowledge in Context for Knowledge Graph Completion (Wei et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-emnlp.580.pdf