Shobhan Kumar


2022

Augmenting eBooks with recommended questions using contrastive fine-tuned T5
Shobhan Kumar | Arun Chauhan | Pavan Kumar
Proceedings of the 19th International Conference on Natural Language Processing (ICON)

Recent advances in AI have made it possible to generate questions from natural language text; the approach completely removes the human from the loop while producing appropriate questions, which improves students' learning engagement. The ever-growing amount of educational content makes it increasingly difficult to manually create enough practice or quiz questions to accompany it. Since reading comprehension can be improved by asking the right questions, this work offers a Transformer-based question generation model for autonomously producing quiz questions from educational material such as eBooks. It proposes a contrastive training approach for the ‘Text-to-Text Transfer Transformer’ (T5) model, in which the model (T5-eQG) first creates a summarized version of the input document and then automatically generates questions from it. The model shows promising results over earlier neural-network-based and rule-based models on the question generation task, evaluated on benchmark datasets and NCERT eBooks.
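
The abstract describes a two-step pipeline: summarize the input document with T5, then generate questions from the summary. Below is a minimal sketch of that pipeline using a public T5 checkpoint from the Hugging Face transformers library as a stand-in; the checkpoint name, prompts, and generation settings are assumptions for illustration, and the paper's contrastively fine-tuned T5-eQG model is not reproduced here.

# Sketch of the summarize-then-generate pipeline described in the abstract.
# Uses a generic "t5-small" checkpoint as a placeholder for T5-eQG.
from transformers import T5TokenizerFast, T5ForConditionalGeneration

model_name = "t5-small"  # placeholder; the paper fine-tunes its own T5 variant
tokenizer = T5TokenizerFast.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)

def summarize(text: str, max_len: int = 128) -> str:
    # Step 1: condense the eBook passage using T5's "summarize:" task prefix.
    inputs = tokenizer("summarize: " + text, return_tensors="pt",
                       truncation=True, max_length=512)
    ids = model.generate(**inputs, max_length=max_len, num_beams=4)
    return tokenizer.decode(ids[0], skip_special_tokens=True)

def generate_questions(summary: str, n: int = 3) -> list[str]:
    # Step 2: prompt the model to produce quiz questions from the summary.
    # The "generate question:" prefix is an assumed prompt; a vanilla
    # t5-small is not trained for it, whereas T5-eQG would be fine-tuned
    # so that this step yields well-formed questions.
    inputs = tokenizer("generate question: " + summary, return_tensors="pt",
                       truncation=True, max_length=256)
    ids = model.generate(**inputs, max_length=64, num_beams=4,
                         num_return_sequences=n)
    return [tokenizer.decode(i, skip_special_tokens=True) for i in ids]

if __name__ == "__main__":
    passage = ("Photosynthesis is the process by which green plants use "
               "sunlight, water, and carbon dioxide to produce glucose "
               "and oxygen.")  # example eBook passage for illustration
    summary = summarize(passage)
    for question in generate_questions(summary):
        print(question)

In the paper's setting, the question-generation step would be handled by the contrastively fine-tuned model rather than a prompt against an off-the-shelf checkpoint; the sketch only illustrates the summarize-then-generate flow.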