ConQuest: Contextual Question Paraphrasing through Answer-Aware Synthetic Question Generation

Mostafa Mirshekari, Jing Gu, Aaron Sisto


Abstract
Despite excellent performance on tasks such as question answering, Transformer-based architectures remain sensitive to syntactic and contextual ambiguities. Question Paraphrasing (QP) offers a promising solution as a means to augment existing datasets. The main challenges of current QP models include a lack of training data and difficulty in generating diverse and natural questions. In this paper, we present ConQuest, a framework for generating synthetic datasets for contextual question paraphrasing. To this end, ConQuest first employs an answer-aware question generation (QG) model to create a question-pair dataset and then uses this data to train a contextualized question paraphrasing model. We extensively evaluate ConQuest and show its ability to produce more diverse and fluent question pairs than existing approaches. Our contextual paraphrase model also establishes a strong baseline for end-to-end contextual paraphrasing. Further, we find that context can improve the BLEU-1 score on contextual compression and expansion by 4.3 and 11.2 points, respectively, compared to a non-contextual model.
Anthology ID:
2021.wnut-1.25
Volume:
Proceedings of the Seventh Workshop on Noisy User-generated Text (W-NUT 2021)
Month:
November
Year:
2021
Address:
Online
Editors:
Wei Xu, Alan Ritter, Tim Baldwin, Afshin Rahimi
Venue:
WNUT
Publisher:
Association for Computational Linguistics
Pages:
222–229
URL:
https://aclanthology.org/2021.wnut-1.25
DOI:
10.18653/v1/2021.wnut-1.25
Bibkey:
Cite (ACL):
Mostafa Mirshekari, Jing Gu, and Aaron Sisto. 2021. ConQuest: Contextual Question Paraphrasing through Answer-Aware Synthetic Question Generation. In Proceedings of the Seventh Workshop on Noisy User-generated Text (W-NUT 2021), pages 222–229, Online. Association for Computational Linguistics.
Cite (Informal):
ConQuest: Contextual Question Paraphrasing through Answer-Aware Synthetic Question Generation (Mirshekari et al., WNUT 2021)
PDF:
https://aclanthology.org/2021.wnut-1.25.pdf
Data
SQuAD