Learning to Ask More: Semi-Autoregressive Sequential Question Generation under Dual-Graph Interaction

Zi Chai, Xiaojun Wan


Abstract
Traditional Question Generation (TQG) aims to generate a question given an input passage and an answer. When there is a sequence of answers, we can perform Sequential Question Generation (SQG) to produce a series of interconnected questions. Due to the frequent information omission and coreference between questions, SQG is rather challenging. Prior works regarded SQG as a dialog generation task and recurrently produced each question. However, they suffered from error cascades and could only capture limited context dependencies. To this end, we generate questions in a semi-autoregressive way. Our model divides questions into different groups and generates each group of them in parallel. During this process, it builds two graphs focusing on information from passages and answers, respectively, and performs dual-graph interaction to gather information for generation. Besides, we design an answer-aware attention mechanism and a coarse-to-fine generation strategy. Experiments on our new dataset containing 81.9K questions show that our model substantially outperforms prior works.
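The semi-autoregressive scheme described in the abstract can be illustrated with a toy sketch (an illustration of the generation order only, not the authors' implementation; `generate_question` is a hypothetical stand-in for a real decoder): questions within a group share no dependence on each other, so they could be decoded in parallel, while each group conditions on the questions produced by earlier groups.

```python
def generate_question(answer, history):
    # Hypothetical stand-in for a neural decoder: produces a question
    # conditioned on its answer and on previously generated questions.
    return f"Q({answer}|ctx={len(history)})"

def semi_autoregressive_generate(answer_groups):
    history = []    # questions from earlier groups (the autoregressive axis)
    questions = []
    for group in answer_groups:
        # Every question in this group depends only on `history`,
        # so the inner list could be computed in parallel.
        new = [generate_question(a, history) for a in group]
        questions.extend(new)
        history.extend(new)  # later groups see these questions
    return questions

print(semi_autoregressive_generate([["a1", "a2"], ["a3"]]))
# → ['Q(a1|ctx=0)', 'Q(a2|ctx=0)', 'Q(a3|ctx=2)']
```

The autoregressive dependency thus runs only across groups, which is what reduces the error cascades of fully recurrent, question-by-question generation.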
Anthology ID:
2020.acl-main.21
Volume:
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2020
Address:
Online
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
225–237
URL:
https://aclanthology.org/2020.acl-main.21
DOI:
10.18653/v1/2020.acl-main.21
Bibkey:
Cite (ACL):
Zi Chai and Xiaojun Wan. 2020. Learning to Ask More: Semi-Autoregressive Sequential Question Generation under Dual-Graph Interaction. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pages 225–237, Online. Association for Computational Linguistics.
Cite (Informal):
Learning to Ask More: Semi-Autoregressive Sequential Question Generation under Dual-Graph Interaction (Chai & Wan, ACL 2020)
PDF:
https://aclanthology.org/2020.acl-main.21.pdf
Video:
http://slideslive.com/38929330