%0 Conference Proceedings
%T PathQG: Neural Question Generation from Facts
%A Wang, Siyuan
%A Wei, Zhongyu
%A Fan, Zhihao
%A Huang, Zengfeng
%A Sun, Weijian
%A Zhang, Qi
%A Huang, Xuanjing
%Y Webber, Bonnie
%Y Cohn, Trevor
%Y He, Yulan
%Y Liu, Yang
%S Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)
%D 2020
%8 November
%I Association for Computational Linguistics
%C Online
%F wang-etal-2020-pathqg
%X Existing research for question generation encodes the input text as a sequence of tokens without explicitly modeling fact information. These models tend to generate irrelevant and uninformative questions. In this paper, we explore to incorporate facts in the text for question generation in a comprehensive way. We present a novel task of question generation given a query path in the knowledge graph constructed from the input text. We divide the task into two steps, namely, query representation learning and query-based question generation. We formulate query representation learning as a sequence labeling problem for identifying the involved facts to form a query and employ an RNN-based generator for question generation. We first train the two modules jointly in an end-to-end fashion, and further enforce the interaction between these two modules in a variational framework. We construct the experimental datasets on top of SQuAD and results show that our model outperforms other state-of-the-art approaches, and the performance margin is larger when target questions are complex. Human evaluation also proves that our model is able to generate relevant and informative questions.
%R 10.18653/v1/2020.emnlp-main.729
%U https://aclanthology.org/2020.emnlp-main.729
%U https://doi.org/10.18653/v1/2020.emnlp-main.729
%P 9066-9075