PerKGQA: Question Answering over Personalized Knowledge Graphs

Ritam Dutt, Kasturi Bhattacharjee, Rashmi Gangadharaiah, Dan Roth, Carolyn Rose


Abstract
Previous studies on question answering over knowledge graphs have typically operated over a single knowledge graph (KG). This KG is assumed to be known a priori and is leveraged similarly for all users’ queries during inference. However, such an assumption is not applicable to real-world settings, such as healthcare, where one needs to handle queries of new users over unseen KGs during inference. Furthermore, privacy concerns and high computational costs render it infeasible to query the single KG that has information about all users while answering a specific user’s query. The above concerns motivate our question answering setting over personalized knowledge graphs (PerKGQA), where each user has restricted access to their KG. We observe that current state-of-the-art KGQA methods that require learning prior node representations fare poorly. We propose two complementary approaches, PathCBR and PathRGCN, for PerKGQA. The former is a simple non-parametric technique that employs case-based reasoning, while the latter is a parametric approach using graph neural networks. Our proposed methods circumvent learning prior representations, can generalize to unseen KGs, and outperform strong baselines on an academic and an internal dataset by 6.5% and 10.5%, respectively.
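The abstract describes the two approaches only at a high level. As a rough illustration of the parametric direction, the sketch below shows how a relational graph convolutional network (RGCN) could score every node of a single user's KG against an encoded question without relying on pre-learned, KG-specific node embeddings. This is a minimal sketch using PyTorch Geometric's RGCNConv; the class name PersonalKGScorer, the bilinear scoring head, and all dimensions are illustrative assumptions and do not reproduce the paper's PathRGCN architecture.

```python
# Minimal sketch (assumptions): encode one user's KG with RGCN layers and score
# every node as a candidate answer against a question embedding.
# Illustrative only; this is NOT the paper's PathRGCN model.
import torch
import torch.nn as nn
from torch_geometric.nn import RGCNConv


class PersonalKGScorer(nn.Module):
    def __init__(self, num_relations: int, hidden_dim: int = 128):
        super().__init__()
        # Two relational graph-convolution layers over the user's KG.
        self.conv1 = RGCNConv(hidden_dim, hidden_dim, num_relations)
        self.conv2 = RGCNConv(hidden_dim, hidden_dim, num_relations)
        # Bilinear scorer between the question embedding and each node embedding.
        self.score = nn.Bilinear(hidden_dim, hidden_dim, 1)

    def forward(self, x, edge_index, edge_type, q_emb):
        # x: node features built for THIS user's KG at query time (no global
        # node-id embedding table), so no prior representations are learned.
        h = torch.relu(self.conv1(x, edge_index, edge_type))
        h = self.conv2(h, edge_index, edge_type)
        q = q_emb.expand(h.size(0), -1)       # broadcast question to all nodes
        return self.score(h, q).squeeze(-1)   # one answer score per node


# Toy usage: a 5-node personal KG with 3 relation types.
x = torch.randn(5, 128)                       # query-time node features
edge_index = torch.tensor([[0, 1, 2, 3], [1, 2, 3, 4]])
edge_type = torch.tensor([0, 1, 2, 0])
q_emb = torch.randn(1, 128)                   # encoded question (e.g., from an LM)
scores = PersonalKGScorer(num_relations=3)(x, edge_index, edge_type, q_emb)
print(scores.shape)                           # torch.Size([5])
```

Because the node features are constructed per query rather than looked up from a trained embedding table, the same model can be applied to an unseen user's KG at inference time, which is the constraint the PerKGQA setting imposes.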
Anthology ID:
2022.findings-naacl.19
Volume:
Findings of the Association for Computational Linguistics: NAACL 2022
Month:
July
Year:
2022
Address:
Seattle, United States
Editors:
Marine Carpuat, Marie-Catherine de Marneffe, Ivan Vladimir Meza Ruiz
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
253–268
URL:
https://aclanthology.org/2022.findings-naacl.19
DOI:
10.18653/v1/2022.findings-naacl.19
Cite (ACL):
Ritam Dutt, Kasturi Bhattacharjee, Rashmi Gangadharaiah, Dan Roth, and Carolyn Rose. 2022. PerKGQA: Question Answering over Personalized Knowledge Graphs. In Findings of the Association for Computational Linguistics: NAACL 2022, pages 253–268, Seattle, United States. Association for Computational Linguistics.
Cite (Informal):
PerKGQA: Question Answering over Personalized Knowledge Graphs (Dutt et al., Findings 2022)
PDF:
https://aclanthology.org/2022.findings-naacl.19.pdf
Video:
https://aclanthology.org/2022.findings-naacl.19.mp4