Improving Zero-shot Reader by Reducing Distractions from Irrelevant Documents in Open-Domain Question Answering

Sukmin Cho, Jeongyeon Seo, Soyeong Jeong, Jong Park


Abstract
Large language models (LLMs) enable zero-shot approaches in open-domain question answering (ODQA), yet the reader has seen limited advances compared to the retriever. This study examines the feasibility of a zero-shot reader, which avoids the computational cost and labeled-data requirements of supervised readers. We find that, when used as zero-shot readers, LLMs are distracted by irrelevant documents in the retrieved set and are overconfident in their generated answers. To tackle these problems, we mitigate the impact of such documents via Distraction-aware Answer Selection (DAS), which combines a negation-based instruction with score adjustment for proper answer selection. Experimental results show that our approach successfully handles distraction across diverse scenarios, improving the performance of zero-shot readers. Furthermore, unlike supervised readers, which struggle on unseen data, zero-shot readers demonstrate outstanding transferability without any training.
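To make the selection step concrete, below is a minimal illustrative sketch (not the authors' implementation) of the general idea of adjusting an answer's generation score by the relevance of its source document before picking the final answer. All names (Candidate, select_answer) and the placeholder log-probability values are hypothetical; in practice the scores would come from the LLM used as the zero-shot reader.

```python
# Hypothetical sketch of distraction-aware answer selection:
# combine each candidate answer's generation score with a relevance
# score for the document it was generated from, then choose the
# candidate with the highest adjusted score.

from dataclasses import dataclass
from typing import List


@dataclass
class Candidate:
    document_id: str
    answer: str
    answer_logprob: float     # log P(answer | question, document), from the reader LLM
    relevance_logprob: float  # log P(document is relevant | question), e.g. from an instruction-based check


def select_answer(candidates: List[Candidate]) -> str:
    """Pick the answer whose generation score, adjusted by the
    relevance of its source document, is highest."""
    best = max(candidates, key=lambda c: c.answer_logprob + c.relevance_logprob)
    return best.answer


if __name__ == "__main__":
    cands = [
        Candidate("doc1", "Paris", answer_logprob=-0.4, relevance_logprob=-0.2),
        Candidate("doc2", "Lyon", answer_logprob=-0.3, relevance_logprob=-2.5),  # confident answer, but irrelevant document
    ]
    print(select_answer(cands))  # -> "Paris"
```

The adjustment penalizes answers drawn from likely irrelevant documents, so an overconfident answer from a distracting document no longer wins the selection outright.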
Anthology ID:
2023.findings-emnlp.207
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2023
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
3145–3157
URL:
https://aclanthology.org/2023.findings-emnlp.207
DOI:
10.18653/v1/2023.findings-emnlp.207
Cite (ACL):
Sukmin Cho, Jeongyeon Seo, Soyeong Jeong, and Jong Park. 2023. Improving Zero-shot Reader by Reducing Distractions from Irrelevant Documents in Open-Domain Question Answering. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 3145–3157, Singapore. Association for Computational Linguistics.
Cite (Informal):
Improving Zero-shot Reader by Reducing Distractions from Irrelevant Documents in Open-Domain Question Answering (Cho et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-emnlp.207.pdf