RetinaQA: A Robust Knowledge Base Question Answering Model for both Answerable and Unanswerable Questions

Prayushi Faldu, Indrajit Bhattacharya, Mausam


Abstract
An essential requirement for a real-world Knowledge Base Question Answering (KBQA) system is the ability to detect the answerability of questions when generating logical forms. However, state-of-the-art KBQA models assume all questions to be answerable. Recent research has found that such models, when superficially adapted to detect answerability, struggle to satisfactorily identify the different categories of unanswerable questions, and simultaneously preserve good performance for answerable questions. Towards addressing this issue, we propose RetinaQA, a new KBQA model that unifies two key ideas in a single KBQA architecture: (a) discrimination over candidate logical forms, rather than generating them, for handling schema-related unanswerability, and (b) sketch-filling-based construction of candidate logical forms for handling data-related unanswerability. Our results show that RetinaQA significantly outperforms adaptations of state-of-the-art KBQA models in handling both answerable and unanswerable questions and demonstrates robustness across all categories of unanswerability. Notably, RetinaQA also sets a new state-of-the-art for answerable KBQA, surpassing existing models. We release our code base for further research: https://github.com/dair-iitd/RetinaQA.
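To make the discrimination idea concrete, the following is a minimal, illustrative Python sketch of ranking enumerated candidate logical forms and abstaining when no candidate scores well enough. All names here (select_logical_form, score_candidate, threshold) are hypothetical assumptions for illustration and are not taken from the RetinaQA code base; see the linked repository for the actual implementation.

```python
# Illustrative sketch only: discrimination over candidate logical forms
# with abstention, not the authors' implementation.
from typing import Callable, List, Optional


def select_logical_form(
    question: str,
    candidates: List[str],
    score_candidate: Callable[[str, str], float],
    threshold: float = 0.5,  # hypothetical abstention threshold
) -> Optional[str]:
    """Rank candidate logical forms and return None to signal unanswerability.

    An empty candidate set models data-related unanswerability (no logical
    form can be constructed); a best candidate scoring below the threshold
    models schema-related unanswerability.
    """
    if not candidates:
        return None
    best = max(candidates, key=lambda lf: score_candidate(question, lf))
    return best if score_candidate(question, best) >= threshold else None
```

In this sketch, score_candidate stands in for a learned question-to-logical-form relevance model; the key design point illustrated is that the system ranks pre-constructed candidates instead of free-form generation, which lets it abstain cleanly.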
Anthology ID:
2024.acl-long.359
Volume:
Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
August
Year:
2024
Address:
Bangkok, Thailand
Editors:
Lun-Wei Ku, Andre Martins, Vivek Srikumar
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
6643–6656
URL:
https://aclanthology.org/2024.acl-long.359
Cite (ACL):
Prayushi Faldu, Indrajit Bhattacharya, and Mausam. 2024. RetinaQA: A Robust Knowledge Base Question Answering Model for both Answerable and Unanswerable Questions. In Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 6643–6656, Bangkok, Thailand. Association for Computational Linguistics.
Cite (Informal):
RetinaQA: A Robust Knowledge Base Question Answering Model for both Answerable and Unanswerable Questions (Faldu et al., ACL 2024)
PDF:
https://aclanthology.org/2024.acl-long.359.pdf