Yin Dawei


2021

Enhancing Question Generation with Commonsense Knowledge
Jia Xin | Wang Hao | Yin Dawei | Wu Yunfang
Proceedings of the 20th Chinese National Conference on Computational Linguistics

Question generation (QG) aims to generate natural and grammatical questions that can be answered by a specific answer for a given context. Previous sequence-to-sequence models suffer from the problem that asking high-quality questions requires commonsense knowledge as background, which in most cases cannot be learned directly from training data, resulting in unsatisfactory questions deprived of knowledge. In this paper, we propose a multi-task learning framework to introduce commonsense knowledge into the question generation process. We first retrieve relevant commonsense knowledge triples from mature databases and select triples that carry the conversion information from source context to question. Based on these informative knowledge triples, we design two auxiliary tasks to incorporate commonsense knowledge into the main QG model: one task is Concept Relation Classification and the other is Tail Concept Generation. Experimental results on SQuAD show that our proposed methods noticeably improve QG performance on both automatic and human evaluation metrics, demonstrating that incorporating external commonsense knowledge with multi-task learning can help the model generate human-like and high-quality questions.
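The pipeline the abstract describes, retrieving knowledge triples, keeping those that link the source context to the target question, and combining the main QG loss with two auxiliary losses, could be sketched as below. The function names, the filtering rule (head concept in context, tail concept in question), and the loss weights are illustrative assumptions, not the paper's actual implementation.

```python
# Illustrative sketch only: triple selection and a weighted multi-task loss.
# All names, the filtering heuristic, and the weights are assumptions.

def select_triples(triples, context_tokens, question_tokens):
    """Keep (head, relation, tail) triples whose head concept appears in
    the source context and whose tail concept appears in the question,
    i.e. triples carrying conversion information from context to question."""
    return [
        (head, rel, tail)
        for head, rel, tail in triples
        if head in context_tokens and tail in question_tokens
    ]

def multitask_loss(qg_loss, crc_loss, tcg_loss, alpha=0.1, beta=0.1):
    """Combine the main QG loss with the two auxiliary task losses
    (Concept Relation Classification and Tail Concept Generation) as a
    weighted sum; alpha and beta are assumed trade-off weights."""
    return qg_loss + alpha * crc_loss + beta * tcg_loss
```

In such a setup, gradients from all three losses would update a shared encoder at each training step, which is how the auxiliary commonsense signal reaches the question generator.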