Learning Answer Generation using Supervision from Automatic Question Answering Evaluators

Matteo Gabburo, Siddhant Garg, Rik Koncel-Kedziorski, Alessandro Moschitti


Abstract
Recent studies show that sentence-level extractive QA, i.e., based on Answer Sentence Selection (AS2), is outperformed by Generation-based QA (GenQA) models, which generate answers using the top-k answer sentences ranked by AS2 models (à la retrieval-augmented generation). In this paper, we propose a novel training paradigm for GenQA using supervision from automatic QA evaluation models (GAVA). Specifically, we propose three strategies to transfer knowledge from these QA evaluation models to a GenQA model: (i) augmenting the training data with answers generated by the GenQA model and labelled by GAVA statically, before training; (ii) doing so dynamically, at every training epoch; and (iii) using the GAVA score to weight the generator loss during GenQA training. We evaluate our proposed methods on two academic and one industrial dataset, obtaining a significant improvement in answering accuracy over the previous state of the art.
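Strategy (iii) can be illustrated with a minimal sketch. This is not the authors' code: the function and variable names below (`weighted_generation_loss`, `evaluator_scores`) are hypothetical stand-ins for the paper's GAVA evaluator output and the GenQA generation loss, shown only to make the loss-weighting idea concrete.

```python
def weighted_generation_loss(per_example_losses, evaluator_scores):
    """Scale each example's generation loss by its QA-evaluator score.

    Answers judged better by the evaluator (higher score) contribute
    more to the training signal; low-scored answers are down-weighted.
    """
    assert len(per_example_losses) == len(evaluator_scores)
    total = sum(loss * score
                for loss, score in zip(per_example_losses, evaluator_scores))
    return total / len(per_example_losses)


# Example: three generated answers with cross-entropy losses and
# evaluator scores in [0, 1].
losses = [2.0, 1.5, 3.0]
scores = [0.9, 0.2, 0.5]
print(weighted_generation_loss(losses, scores))  # approx. 1.2
```

In an actual training loop the per-example losses would come from the generator's cross-entropy and the scores from the (frozen) QA evaluation model, but the weighting step itself is this simple.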
Anthology ID:
2023.acl-long.467
Volume:
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
8389–8403
URL:
https://aclanthology.org/2023.acl-long.467
DOI:
10.18653/v1/2023.acl-long.467
Cite (ACL):
Matteo Gabburo, Siddhant Garg, Rik Koncel-Kedziorski, and Alessandro Moschitti. 2023. Learning Answer Generation using Supervision from Automatic Question Answering Evaluators. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 8389–8403, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Learning Answer Generation using Supervision from Automatic Question Answering Evaluators (Gabburo et al., ACL 2023)
PDF:
https://aclanthology.org/2023.acl-long.467.pdf
Video:
https://aclanthology.org/2023.acl-long.467.mp4