Memory Graph Networks for Explainable Memory-grounded Question Answering

Seungwhan Moon, Pararth Shah, Anuj Kumar, Rajen Subba


Abstract
We introduce Episodic Memory QA, the task of answering personal user questions grounded on a memory graph (MG), where episodic memories and related entity nodes are connected via relational edges. We create a new benchmark dataset by first generating synthetic memory graphs with simulated attributes, and then composing 100K QA pairs for the generated MGs with bootstrapped scripts. To address the unique challenges of the proposed task, we propose Memory Graph Networks (MGN), a novel extension of memory networks that enables dynamic expansion of memory slots through graph traversals and can thus answer queries that require context from multiple linked episodes and external knowledge. We then propose the Episodic Memory QA Net, which combines multiple module networks to effectively handle various question types. Empirical results show improvements over QA baselines in top-k answer prediction accuracy on the proposed task. The proposed model also generates a graph walk path and attention vectors for each predicted answer, providing a natural way to explain its QA reasoning.
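
To make the memory-slot expansion idea concrete, below is a minimal, hypothetical Python sketch of growing an initial set of memory slots by walking the relational edges of a memory graph. The node names, relation labels, and breadth-first expansion policy are illustrative assumptions, not the authors' MGN implementation; the returned walk path only mimics the kind of trace the paper uses for explanation.

```python
# Hypothetical sketch: expand memory slots by traversing a memory graph.
# Nodes are episodic memories ("episode:...") and entities ("entity:...");
# edges carry relation labels. Names and policy are invented for illustration.

from collections import defaultdict

class MemoryGraph:
    def __init__(self):
        # adjacency list: node -> list of (relation, neighbor)
        self.edges = defaultdict(list)

    def add_edge(self, src, relation, dst):
        self.edges[src].append((relation, dst))

    def expand(self, seed_nodes, hops=2):
        """Breadth-first expansion of memory slots via graph traversal.

        Returns the expanded slot set and the traversed (src, relation, dst)
        triples, which can serve as a human-readable walk path.
        """
        frontier, visited, path = list(seed_nodes), set(seed_nodes), []
        for _ in range(hops):
            next_frontier = []
            for node in frontier:
                for relation, neighbor in self.edges[node]:
                    path.append((node, relation, neighbor))
                    if neighbor not in visited:
                        visited.add(neighbor)
                        next_frontier.append(neighbor)
            frontier = next_frontier
        return visited, path


# Toy episodic memory graph and a two-hop expansion from one episode.
mg = MemoryGraph()
mg.add_edge("episode:beach_trip_2018", "with_person", "entity:Alice")
mg.add_edge("episode:beach_trip_2018", "at_location", "entity:Santa_Cruz")
mg.add_edge("entity:Alice", "attended", "episode:birthday_2019")

slots, walk = mg.expand({"episode:beach_trip_2018"}, hops=2)
print(sorted(slots))  # expanded memory slots available to the QA model
print(walk)           # graph walk path usable as an explanation trace
```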
Anthology ID:
K19-1068
Volume:
Proceedings of the 23rd Conference on Computational Natural Language Learning (CoNLL)
Month:
November
Year:
2019
Address:
Hong Kong, China
Editors:
Mohit Bansal, Aline Villavicencio
Venue:
CoNLL
SIG:
SIGNLL
Publisher:
Association for Computational Linguistics
Pages:
728–736
URL:
https://aclanthology.org/K19-1068
DOI:
10.18653/v1/K19-1068
Cite (ACL):
Seungwhan Moon, Pararth Shah, Anuj Kumar, and Rajen Subba. 2019. Memory Graph Networks for Explainable Memory-grounded Question Answering. In Proceedings of the 23rd Conference on Computational Natural Language Learning (CoNLL), pages 728–736, Hong Kong, China. Association for Computational Linguistics.
Cite (Informal):
Memory Graph Networks for Explainable Memory-grounded Question Answering (Moon et al., CoNLL 2019)
PDF:
https://aclanthology.org/K19-1068.pdf