CompleQA: Benchmarking the Impacts of Knowledge Graph Completion Methods on Question Answering

Donghan Yu, Yu Gu, Chenyan Xiong, Yiming Yang


Abstract
How much success in Knowledge Graph Completion (KGC) translates into performance gains in downstream tasks is an important question that has not been studied in depth. In this paper, we introduce a novel benchmark, namely CompleQA, to comprehensively assess the influence of representative KGC methods on Knowledge Graph Question Answering (KGQA), one of the most important downstream applications. This benchmark includes a knowledge graph with 3 million triplets across 5 distinct domains, coupled with over 5000 question-answering pairs and a completion dataset that is well-aligned with these questions. Our evaluation of four well-known KGC methods in combination with two state-of-the-art KGQA systems shows that effective KGC can significantly mitigate the impact of knowledge graph incompleteness on question-answering performance. Surprisingly, we also find that the best-performing KGC methods do not necessarily lead to the best QA results, underscoring the need to consider downstream applications when doing KGC.
Anthology ID:
2023.findings-emnlp.849
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2023
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
12748–12755
URL:
https://aclanthology.org/2023.findings-emnlp.849
DOI:
10.18653/v1/2023.findings-emnlp.849
Cite (ACL):
Donghan Yu, Yu Gu, Chenyan Xiong, and Yiming Yang. 2023. CompleQA: Benchmarking the Impacts of Knowledge Graph Completion Methods on Question Answering. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 12748–12755, Singapore. Association for Computational Linguistics.
Cite (Informal):
CompleQA: Benchmarking the Impacts of Knowledge Graph Completion Methods on Question Answering (Yu et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-emnlp.849.pdf