PreWoMe: Exploiting Presuppositions as Working Memory for Long Form Question Answering

Wookje Han, Jinsol Park, Kyungjae Lee


Abstract
Information-seeking questions in long-form question answering (LFQA) often prove misleading due to ambiguity or false presuppositions in the question. While many existing approaches handle misleading questions, they are tailored to limited question types, making them insufficient for real-world settings with unpredictable input characteristics. In this work, we propose PreWoMe, a unified approach capable of handling any type of information-seeking question. The key idea of PreWoMe is to extract the presuppositions in a question and exploit them as working memory to generate feedback and actions about the question. Our experiments show that PreWoMe is effective not only in tackling misleading questions but also in handling normal ones, demonstrating the effectiveness of leveraging presuppositions, feedback, and actions in real-world QA settings.
Anthology ID:
2023.emnlp-main.517
Volume:
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
8312–8322
URL:
https://aclanthology.org/2023.emnlp-main.517
DOI:
10.18653/v1/2023.emnlp-main.517
Cite (ACL):
Wookje Han, Jinsol Park, and Kyungjae Lee. 2023. PreWoMe: Exploiting Presuppositions as Working Memory for Long Form Question Answering. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 8312–8322, Singapore. Association for Computational Linguistics.
Cite (Informal):
PreWoMe: Exploiting Presuppositions as Working Memory for Long Form Question Answering (Han et al., EMNLP 2023)
PDF:
https://aclanthology.org/2023.emnlp-main.517.pdf
Video:
https://aclanthology.org/2023.emnlp-main.517.mp4