DRLK: Dynamic Hierarchical Reasoning with Language Model and Knowledge Graph for Question Answering

Miao Zhang, Rufeng Dai, Ming Dong, Tingting He


Abstract
In recent years, Graph Neural Network (GNN) approaches that incorporate knowledge graphs (KGs) have performed well on question answering (QA) tasks. A critical challenge is how to effectively exploit the interactions between the QA context and the KG. Existing work, however, uses one identical QA context representation to interact with every layer of the KG, which restricts the interaction. In this paper, we propose DRLK (Dynamic Hierarchical Reasoning with Language Model and Knowledge Graphs), a novel model that exploits dynamic hierarchical interactions between the QA context and the KG for reasoning. DRLK extracts dynamic hierarchical features from the QA context and performs inter-layer and intra-layer interactions at each iteration, allowing the KG representation to be grounded in the hierarchical features of the QA context. We conduct extensive experiments on four benchmark datasets in medical QA and commonsense reasoning. The results show that DRLK achieves state-of-the-art performance on two of the benchmarks and performs competitively on the others.
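The abstract's central idea can be illustrated with a toy sketch: instead of reusing one fixed QA representation at every KG layer, each layer interacts with a layer-specific projection of the QA context (inter-layer interaction) before a neighbor-aggregation step over the graph (intra-layer interaction). This is a minimal numpy illustration of that pattern only, not the authors' implementation; the function name `drlk_sketch`, the sigmoid gating, and the mean-aggregation update are all invented for illustration.

```python
import numpy as np

def drlk_sketch(qa_vec, node_feats, adj, num_layers=2, seed=0):
    """Toy sketch of layer-specific QA/KG interaction (illustrative only).

    qa_vec:     (d,)   pooled QA-context vector
    node_feats: (n, d) initial KG node features
    adj:        (n, n) adjacency matrix of the KG subgraph
    """
    rng = np.random.default_rng(seed)
    d = qa_vec.shape[0]
    h = node_feats.copy()
    for _ in range(num_layers):
        # Inter-layer interaction: derive a fresh, layer-specific
        # feature of the QA context instead of reusing one fixed vector.
        W = rng.standard_normal((d, d)) / np.sqrt(d)
        qa_feat = np.tanh(W @ qa_vec)
        # Ground each node in this layer's QA feature via a
        # sigmoid relevance gate (invented gating scheme).
        gate = 1.0 / (1.0 + np.exp(-(h @ qa_feat)))
        h = h * gate[:, None] + (1.0 - gate)[:, None] * qa_feat
        # Intra-layer interaction: one round of neighbor averaging,
        # standing in for a generic GNN message-passing step.
        deg = adj.sum(axis=1, keepdims=True).clip(min=1.0)
        h = (adj @ h) / deg
    return h
```

The contrast with the baseline criticized in the abstract is that `qa_feat` is recomputed inside the loop; freezing it outside the loop would reproduce the "identical representation at every layer" setup the paper argues against.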
Anthology ID:
2022.emnlp-main.342
Volume:
Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates
Editors:
Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
5123–5133
URL:
https://aclanthology.org/2022.emnlp-main.342
DOI:
10.18653/v1/2022.emnlp-main.342
Cite (ACL):
Miao Zhang, Rufeng Dai, Ming Dong, and Tingting He. 2022. DRLK: Dynamic Hierarchical Reasoning with Language Model and Knowledge Graph for Question Answering. In Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, pages 5123–5133, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
DRLK: Dynamic Hierarchical Reasoning with Language Model and Knowledge Graph for Question Answering (Zhang et al., EMNLP 2022)
PDF:
https://aclanthology.org/2022.emnlp-main.342.pdf