Generating Self-Contained and Summary-Centric Question Answer Pairs via Differentiable Reward Imitation Learning

Li Zhou, Kevin Small, Yong Zhang, Sandeep Atluri


Abstract
Motivated by suggested question generation in conversational news recommendation systems, we propose a model for generating question-answer pairs (QA pairs) with self-contained, summary-centric questions and length-constrained, article-summarizing answers. We begin by collecting a new dataset of news articles with questions as titles and pairing them with summaries of varying length. This dataset is used to learn a QA pair generation model that jointly produces questions and their answers, where each answer is a summary balancing brevity with sufficiency. We then reinforce the QA pair generation process with a differentiable reward function to mitigate exposure bias, a common problem in natural language generation. Both automatic metrics and human evaluation demonstrate that these QA pairs successfully capture the central gists of the articles and achieve high answer accuracy.
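The key idea of a differentiable reward is that the reward is computed from the model's soft output distribution rather than from discrete sampled tokens, so it can be folded directly into the training loss. The toy sketch below (not the paper's actual implementation; the model, embeddings, and reward are all illustrative assumptions) combines a standard cross-entropy term with an expected-embedding similarity reward that remains differentiable with respect to the logits.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Toy vocabulary embeddings (hypothetical; the paper uses a learned
# Transformer-based generator, not this fixed table).
rng = np.random.default_rng(0)
V, D = 5, 4
emb = rng.normal(size=(V, D))

def differentiable_reward(logits, summary_vec):
    """Expected-embedding reward: mixing embeddings by the softmax
    probabilities (instead of sampling a discrete token) keeps the
    reward differentiable w.r.t. the logits."""
    p = softmax(logits)
    expected = p @ emb  # (D,) soft token embedding
    # Cosine similarity to a target summary representation.
    return float(expected @ summary_vec /
                 (np.linalg.norm(expected) * np.linalg.norm(summary_vec)))

def combined_loss(logits, gold_id, summary_vec, lam=0.5):
    """MLE cross-entropy minus a reward bonus: a minimal stand-in for
    a reward-augmented training objective (lam is an assumed weight)."""
    ce = -np.log(softmax(logits)[gold_id])
    return ce - lam * differentiable_reward(logits, summary_vec)

logits = np.array([2.0, 0.5, -1.0, 0.0, 1.0])
summary_vec = emb[0]
loss = combined_loss(logits, gold_id=0, summary_vec=summary_vec)
```

Because every operation from logits to loss is smooth, gradient descent on `combined_loss` trades off imitation of the gold token against the reward signal, rather than requiring a high-variance policy-gradient estimator over sampled sequences.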
Anthology ID:
2021.emnlp-main.416
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
5103–5135
URL:
https://aclanthology.org/2021.emnlp-main.416
DOI:
10.18653/v1/2021.emnlp-main.416
Cite (ACL):
Li Zhou, Kevin Small, Yong Zhang, and Sandeep Atluri. 2021. Generating Self-Contained and Summary-Centric Question Answer Pairs via Differentiable Reward Imitation Learning. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 5103–5135, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Generating Self-Contained and Summary-Centric Question Answer Pairs via Differentiable Reward Imitation Learning (Zhou et al., EMNLP 2021)
PDF:
https://aclanthology.org/2021.emnlp-main.416.pdf
Video:
https://aclanthology.org/2021.emnlp-main.416.mp4
Code
amazon-research/sc2qa-dril
Data
Natural Questions, NewsQA, SQuAD