Path Spuriousness-aware Reinforcement Learning for Multi-Hop Knowledge Graph Reasoning

Chunyang Jiang, Tianchen Zhu, Haoyi Zhou, Chang Liu, Ting Deng, Chunming Hu, Jianxin Li


Abstract
Multi-hop reasoning, a prevalent approach for query answering, aims at inferring new facts along reasonable paths over a knowledge graph. Reinforcement learning methods can be adopted by formulating the problem as a Markov decision process. However, a common problem with RL-based reasoning models is that the agent can be biased toward spurious paths that coincidentally lead to the correct answer but offer poor explanations. In this work, we take a deep dive into this phenomenon and define a metric named Path Spuriousness (PS) to quantitatively estimate to what extent a path is spurious. Guided by the definition of PS, we design a model with a new reward that considers both answer accuracy and path reasonableness. We test our method on four datasets, and experiments reveal that our method considerably enhances the agent’s capacity to avoid spurious paths while maintaining performance comparable to the state of the art.
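The reward described in the abstract blends answer accuracy with path reasonableness. The following is a minimal illustrative sketch of that idea, not the paper's actual formulation: it assumes a hypothetical spuriousness score `path_spuriousness` in [0, 1] (1 = fully spurious) and a mixing weight `alpha`, both named here for illustration only; the precise PS definition and reward are given in the paper itself.

```python
def ps_aware_reward(answer_correct: bool, path_spuriousness: float,
                    alpha: float = 0.5) -> float:
    """Illustrative reward mixing terminal answer accuracy with a
    path-reasonableness term derived from a Path Spuriousness score.

    answer_correct   -- whether the agent's final entity answers the query
    path_spuriousness -- assumed score in [0, 1]; 1 means fully spurious
    alpha            -- assumed weight trading accuracy against reasonableness
    """
    accuracy = 1.0 if answer_correct else 0.0
    # A low spuriousness score means the path is a more reasonable explanation.
    reasonableness = 1.0 - path_spuriousness
    return alpha * accuracy + (1.0 - alpha) * reasonableness
```

Under this sketch, a correct answer reached via a fully reasonable path scores higher than a correct answer reached via a spurious one, which is the behavior the new reward is meant to encourage.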
Anthology ID:
2023.eacl-main.232
Volume:
Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics
Month:
May
Year:
2023
Address:
Dubrovnik, Croatia
Editors:
Andreas Vlachos, Isabelle Augenstein
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
3181–3192
URL:
https://aclanthology.org/2023.eacl-main.232
DOI:
10.18653/v1/2023.eacl-main.232
Cite (ACL):
Chunyang Jiang, Tianchen Zhu, Haoyi Zhou, Chang Liu, Ting Deng, Chunming Hu, and Jianxin Li. 2023. Path Spuriousness-aware Reinforcement Learning for Multi-Hop Knowledge Graph Reasoning. In Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics, pages 3181–3192, Dubrovnik, Croatia. Association for Computational Linguistics.
Cite (Informal):
Path Spuriousness-aware Reinforcement Learning for Multi-Hop Knowledge Graph Reasoning (Jiang et al., EACL 2023)
PDF:
https://aclanthology.org/2023.eacl-main.232.pdf
Video:
https://aclanthology.org/2023.eacl-main.232.mp4