DCMKC: A Dual Consistency Matching Approach for Multi-hop Question Answering in LLMs

Xinyi Wang, Yiping Song, Chang Liu, Tingjin Luo, Bo Liu, Zheng Xie, Minlie Huang


Abstract
Reasoning based on chains of thought (CoTs) enables large language models (LLMs) to solve problems by thinking step by step and has become the mainstream approach for question-answering (QA) tasks. Knowledge graph (KG)-enhanced CoT methods help correct factual errors and guide the reasoning direction. Existing KG-enhanced methods find relevant information in KGs “within” each reasoning step of a CoT. However, in some cases, logical connections “between” reasoning steps may be missing or wrong, breaking the reasoning chain and sending reasoning in the wrong direction. To address this problem, we argue that errors between reasoning steps require collaborative verification and the mining of multiple triplets and multiple paths in the KG. We therefore propose DCMKC (Dual Consistency Matching for KG and CoT), which aims to maintain semantic and structural consistency between the KG and the CoT. The main idea is to convert CoTs and KGs into two granularity-aligned graphs, transforming multi-hop reasoning and KG matching into the iterative matching and modification of two graphs. In each iteration, DCMKC matches KG reasoning chains with CoTs based on semantic similarity, judges the structural consistency between them, and then modifies the CoTs using the matched chains. After iteration, the CoTs and KG reasoning chains reach high semantic and structural consistency, which we demonstrate theoretically and experimentally using kernel and spectral methods. The two kinds of chains are then used to generate the final answers. Experimental results show that our method outperforms baselines on multiple datasets, especially on multi-answer questions, with up to a 5.1% improvement over the baseline.
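The iterative match-and-modify loop the abstract describes can be sketched as follows. This is a minimal illustrative version, not the paper's implementation: the bag-of-words cosine similarity, the `verbalize` helper, and the fixed iteration count are stand-ins for the paper's actual encoder, granularity alignment, and consistency criteria.

```python
import math
from collections import Counter

def embed(text):
    # Toy bag-of-words "embedding"; the paper presumably uses a neural encoder.
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two sparse word-count vectors.
    num = sum(a[w] * b[w] for w in a)
    den = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return num / den if den else 0.0

def verbalize(triplet):
    # Render a (head, relation, tail) triplet as a short sentence-like string.
    head, rel, tail = triplet
    return f"{head} {rel.replace('_', ' ')} {tail}"

def refine(cot_steps, kg_triplets, n_iters=3):
    """Iteratively match each CoT step to its most similar KG triplet,
    then rewrite the step from the matched triplet, until convergence."""
    steps = list(cot_steps)
    matched = []
    for _ in range(n_iters):
        matched = [max(kg_triplets, key=lambda t: cosine(embed(s), embed(verbalize(t))))
                   for s in steps]
        new_steps = [verbalize(t) for t in matched]
        if new_steps == steps:  # fixed point: CoT and KG chain agree
            break
        steps = new_steps
    return steps, matched

cot = ["Paris is the capital of Germany",          # factually wrong step
       "The Eiffel Tower is located in Paris"]
kg = [("Paris", "capital_of", "France"),
      ("Eiffel Tower", "located_in", "Paris")]
steps, chain = refine(cot, kg)
```

In this toy run, the erroneous first step is matched to the closest KG triplet and rewritten from it, illustrating how the matched KG chain repairs the CoT; consecutive matched triplets sharing an entity (here, "Paris") gives a crude analogue of the structural-consistency check.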
Anthology ID:
2025.findings-emnlp.16
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2025
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
259–273
URL:
https://aclanthology.org/2025.findings-emnlp.16/
Cite (ACL):
Xinyi Wang, Yiping Song, Chang Liu, Tingjin Luo, Bo Liu, Zheng Xie, and Minlie Huang. 2025. DCMKC: A Dual Consistency Matching Approach for Multi-hop Question Answering in LLMs. In Findings of the Association for Computational Linguistics: EMNLP 2025, pages 259–273, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
DCMKC: A Dual Consistency Matching Approach for Multi-hop Question Answering in LLMs (Wang et al., Findings 2025)
PDF:
https://aclanthology.org/2025.findings-emnlp.16.pdf
Checklist:
 2025.findings-emnlp.16.checklist.pdf