UniK-QA: Unified Representations of Structured and Unstructured Knowledge for Open-Domain Question Answering

Barlas Oguz, Xilun Chen, Vladimir Karpukhin, Stan Peshterliev, Dmytro Okhonko, Michael Schlichtkrull, Sonal Gupta, Yashar Mehdad, Scott Yih


Abstract
We study open-domain question answering with structured, unstructured and semi-structured knowledge sources, including text, tables, lists and knowledge bases. Departing from prior work, we propose a unifying approach that homogenizes all sources by reducing them to text and applies the retriever-reader model, which has so far been limited to text sources only. Our approach greatly improves results on knowledge-base QA tasks, by 11 points compared to the latest graph-based methods. More importantly, we demonstrate that our unified knowledge (UniK-QA) model is a simple yet effective way to combine heterogeneous sources of knowledge, advancing the state-of-the-art results on two popular question answering benchmarks, NaturalQuestions and WebQuestions, by 3.5 and 2.6 points, respectively. The code of UniK-QA is available at: https://github.com/facebookresearch/UniK-QA.
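
The abstract's central idea is to verbalize structured knowledge (KB triples, table rows, lists) into plain-text passages so that a standard text retriever-reader pipeline can index and read them alongside ordinary documents. The sketch below illustrates that linearization step; the function names and verbalization templates are illustrative assumptions, not the paper's exact format (see the repository above for the released implementation).

```python
# Minimal sketch: reduce structured knowledge to text passages so they can be
# added to the same index a text retriever-reader (e.g., DPR + a generative
# reader) already uses. Templates here are assumptions for illustration.

from typing import List, Tuple


def verbalize_triples(triples: List[Tuple[str, str, str]]) -> str:
    """Flatten (subject, relation, object) KB triples into one text passage."""
    return " ".join(f"{s} {r.replace('_', ' ')} {o}." for s, r, o in triples)


def verbalize_table(title: str, header: List[str], rows: List[List[str]]) -> str:
    """Flatten a table row by row, pairing each cell with its column name."""
    sentences = []
    for row in rows:
        cells = ", ".join(f"{col} is {val}" for col, val in zip(header, row))
        sentences.append(f"{title}: {cells}.")
    return " ".join(sentences)


if __name__ == "__main__":
    kb_passage = verbalize_triples(
        [("Seattle", "located_in", "Washington"),
         ("Seattle", "population", "737,015")]
    )
    table_passage = verbalize_table(
        title="NAACL 2022 venue",
        header=["City", "Country"],
        rows=[["Seattle", "United States"]],
    )
    # Both passages are now ordinary text and can be retrieved and read
    # with the same model used for unstructured documents.
    print(kb_passage)
    print(table_passage)
```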
Anthology ID:
2022.findings-naacl.115
Volume:
Findings of the Association for Computational Linguistics: NAACL 2022
Month:
July
Year:
2022
Address:
Seattle, United States
Editors:
Marine Carpuat, Marie-Catherine de Marneffe, Ivan Vladimir Meza Ruiz
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
1535–1546
URL:
https://aclanthology.org/2022.findings-naacl.115
DOI:
10.18653/v1/2022.findings-naacl.115
Cite (ACL):
Barlas Oguz, Xilun Chen, Vladimir Karpukhin, Stan Peshterliev, Dmytro Okhonko, Michael Schlichtkrull, Sonal Gupta, Yashar Mehdad, and Scott Yih. 2022. UniK-QA: Unified Representations of Structured and Unstructured Knowledge for Open-Domain Question Answering. In Findings of the Association for Computational Linguistics: NAACL 2022, pages 1535–1546, Seattle, United States. Association for Computational Linguistics.
Cite (Informal):
UniK-QA: Unified Representations of Structured and Unstructured Knowledge for Open-Domain Question Answering (Oguz et al., Findings 2022)
PDF:
https://aclanthology.org/2022.findings-naacl.115.pdf
Video:
 https://aclanthology.org/2022.findings-naacl.115.mp4
Code:
facebookresearch/UniK-QA
Data:
Natural Questions, TQA, TriviaQA, WebQuestions, WebQuestionsSP