Filter-then-Generate: Large Language Models with Structure-Text Adapter for Knowledge Graph Completion

Ben Liu, Jihai Zhang, Fangquan Lin, Cheng Yang, Min Peng


Abstract
Large Language Models (LLMs) possess massive inherent knowledge and superior semantic comprehension capability, which have revolutionized various tasks in natural language processing. Despite their success, a critical gap remains in enabling LLMs to perform knowledge graph completion (KGC). Empirical evidence suggests that LLMs consistently perform worse than conventional KGC approaches, even with sophisticated prompt design or tailored instruction-tuning. Fundamentally, applying LLMs to KGC introduces several critical challenges, including a vast set of entity candidates, the hallucination issue of LLMs, and under-exploitation of the graph structure. To address these challenges, we propose a novel instruction-tuning-based method, namely FtG. Specifically, we present a filter-then-generate paradigm and formulate the KGC task into a multiple-choice question format. In this way, we can harness the capability of LLMs while mitigating the issues caused by hallucinations. Moreover, we devise a flexible ego-graph serialization prompt and employ a structure-text adapter to couple structure and text information in a contextualized manner. Experimental results demonstrate that FtG achieves substantial performance gains compared to existing state-of-the-art methods. The instruction dataset and code are available at https://github.com/LB0828/FtG.
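To make the filter-then-generate paradigm concrete, the sketch below illustrates the general idea under stated assumptions: a conventional KGC scorer (here a toy stand-in, not the paper's model) filters the full entity set down to top-k candidates, which are then cast as a multiple-choice question prompt for an LLM. All names (`toy_score`, `filter_candidates`, `build_mcq_prompt`) and the prompt wording are hypothetical illustrations, not the authors' implementation.

```python
# Hedged sketch of a filter-then-generate pipeline for KGC.
# The scoring function is a toy placeholder; a real system would use a
# trained KGC model (e.g. an embedding-based scorer) for the filter step.
import string


def toy_score(head: str, relation: str, candidate: str) -> int:
    """Placeholder for a conventional KGC scoring model."""
    return -(abs(hash((head, relation)) - hash(candidate)) % 100)


def filter_candidates(head: str, relation: str, entities: list[str], k: int = 4) -> list[str]:
    """Filter step: keep only the top-k entities under the conventional scorer,
    shrinking the vast candidate set before the LLM sees it."""
    ranked = sorted(entities, key=lambda e: toy_score(head, relation, e), reverse=True)
    return ranked[:k]


def build_mcq_prompt(head: str, relation: str, candidates: list[str]) -> str:
    """Generate step: format the incomplete triple as a multiple-choice
    question, so the LLM picks from filtered options instead of free-form
    generation (mitigating hallucinated entities)."""
    options = "\n".join(
        f"{letter}. {ent}" for letter, ent in zip(string.ascii_uppercase, candidates)
    )
    return (
        f"Question: what is the tail entity of ({head}, {relation}, ?)\n"
        f"Options:\n{options}\n"
        f"Answer with a single letter."
    )


entities = ["Paris", "Berlin", "Tokyo", "Lyon", "Rome", "Cairo"]
cands = filter_candidates("France", "capital", entities, k=4)
print(build_mcq_prompt("France", "capital", cands))
```

The resulting prompt would be passed to an instruction-tuned LLM; the paper additionally injects graph structure via an ego-graph serialization and a structure-text adapter, which this sketch omits.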
Anthology ID:
2025.coling-main.740
Volume:
Proceedings of the 31st International Conference on Computational Linguistics
Month:
January
Year:
2025
Address:
Abu Dhabi, UAE
Editors:
Owen Rambow, Leo Wanner, Marianna Apidianaki, Hend Al-Khalifa, Barbara Di Eugenio, Steven Schockaert
Venue:
COLING
Publisher:
Association for Computational Linguistics
Pages:
11181–11195
URL:
https://aclanthology.org/2025.coling-main.740/
Cite (ACL):
Ben Liu, Jihai Zhang, Fangquan Lin, Cheng Yang, and Min Peng. 2025. Filter-then-Generate: Large Language Models with Structure-Text Adapter for Knowledge Graph Completion. In Proceedings of the 31st International Conference on Computational Linguistics, pages 11181–11195, Abu Dhabi, UAE. Association for Computational Linguistics.
Cite (Informal):
Filter-then-Generate: Large Language Models with Structure-Text Adapter for Knowledge Graph Completion (Liu et al., COLING 2025)
PDF:
https://aclanthology.org/2025.coling-main.740.pdf