Two-Step Question Retrieval for Open-Domain QA

Yeon Seonwoo, Juhee Son, Jiho Jin, Sang-Woo Lee, Ji-Hoon Kim, Jung-Woo Ha, Alice Oh


Abstract
The retriever-reader pipeline has shown promising performance in open-domain QA but suffers from very slow inference speed. Recently proposed question retrieval models tackle this problem by indexing question-answer pairs and searching for similar questions. These models have shown a significant increase in inference speed, but at the cost of lower QA performance compared to the retriever-reader models. This paper proposes a two-step question retrieval model, SQuID (Sequential Question-Indexed Dense retrieval), and distant supervision for training. SQuID uses two bi-encoders for question retrieval. The first-step retriever selects the top-k similar questions, and the second-step retriever finds the most similar question among those top-k questions. We evaluate the performance and the computational efficiency of SQuID. The results show that SQuID significantly increases the performance of existing question retrieval models with negligible loss in inference speed.
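The two-step retrieval described in the abstract (a fast first-step bi-encoder narrowing the index to k candidates, then a second-step bi-encoder picking the best match among them) can be sketched as follows. This is an illustrative sketch, not the authors' released code: the function name, the dict of per-encoder query embeddings, and the use of raw dot-product scores over NumPy arrays are all assumptions made for the example.

```python
import numpy as np

def two_step_retrieve(query_emb, coarse_embs, fine_embs, k=100):
    """Hypothetical two-step question retrieval in the spirit of SQuID.

    query_emb:   dict with "coarse" and "fine" query embeddings,
                 one per bi-encoder (an assumed interface).
    coarse_embs: (n, d) index embeddings from the first-step encoder.
    fine_embs:   (n, d) index embeddings from the second-step encoder.
    Returns the index of the best-matching stored question.
    """
    # Step 1: score the full index with the first-step encoder and
    # keep the k highest-scoring candidates (unordered is fine here).
    coarse_scores = coarse_embs @ query_emb["coarse"]
    topk = np.argpartition(-coarse_scores, k - 1)[:k]
    # Step 2: re-score only those k candidates with the second-step
    # encoder, so the extra cost is O(k) instead of O(n).
    fine_scores = fine_embs[topk] @ query_emb["fine"]
    return int(topk[np.argmax(fine_scores)])

# Toy usage with random vectors standing in for encoder outputs.
rng = np.random.default_rng(0)
n, d = 1000, 64
coarse_embs = rng.normal(size=(n, d))
fine_embs = rng.normal(size=(n, d))
query = {"coarse": rng.normal(size=d), "fine": rng.normal(size=d)}
best = two_step_retrieve(query, coarse_embs, fine_embs, k=50)
```

Because the second encoder only rescores k items rather than the whole index, the added latency over a single-step retriever stays small, which matches the paper's claim of negligible loss in inference speed.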
Anthology ID:
2022.findings-acl.117
Volume:
Findings of the Association for Computational Linguistics: ACL 2022
Month:
May
Year:
2022
Address:
Dublin, Ireland
Venues:
ACL | Findings
Publisher:
Association for Computational Linguistics
Pages:
1487–1492
URL:
https://aclanthology.org/2022.findings-acl.117
DOI:
10.18653/v1/2022.findings-acl.117
Cite (ACL):
Yeon Seonwoo, Juhee Son, Jiho Jin, Sang-Woo Lee, Ji-Hoon Kim, Jung-Woo Ha, and Alice Oh. 2022. Two-Step Question Retrieval for Open-Domain QA. In Findings of the Association for Computational Linguistics: ACL 2022, pages 1487–1492, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Two-Step Question Retrieval for Open-Domain QA (Seonwoo et al., Findings 2022)
PDF:
https://aclanthology.org/2022.findings-acl.117.pdf
Software:
 2022.findings-acl.117.software.zip
Data:
Natural Questions | PAQ | TriviaQA