%0 Conference Proceedings
%T Non-sentential Question Resolution using Sequence to Sequence Learning
%A Kumar, Vineet
%A Joshi, Sachindra
%Y Matsumoto, Yuji
%Y Prasad, Rashmi
%S Proceedings of COLING 2016, the 26th International Conference on Computational Linguistics: Technical Papers
%D 2016
%8 December
%I The COLING 2016 Organizing Committee
%C Osaka, Japan
%F kumar-joshi-2016-non
%X An interactive Question Answering (QA) system frequently encounters non-sentential (incomplete) questions. These non-sentential questions may not make sense to the system when a user asks them without the conversation context. The system thus needs to take the conversation context into account to process the question. In this work, we present a recurrent neural network (RNN) based encoder decoder network that can generate a complete (intended) question, given an incomplete question and conversation context. RNN encoder decoder networks have been shown to work well when trained on a parallel corpus with millions of sentences; however, it is extremely hard to obtain conversation data of this magnitude. We therefore propose to decompose the original problem into two separate simplified problems, where each problem focuses on one abstraction. Specifically, we train a semantic sequence model to learn semantic patterns and a syntactic sequence model to learn linguistic patterns. We further combine the syntactic and semantic sequence models to generate an ensemble model. Our model achieves a BLEU score of 30.15, compared to 18.54 using a standard RNN encoder decoder model.
%U https://aclanthology.org/C16-1190
%P 2022-2031