Augmenting eBooks with with recommended questions using contrastive fine-tuned T5

Shobhan Kumar, Arun Chauhan, Pavan Kumar


Abstract
Recent advances in AI have made it possible to generate questions from natural-language text; the approach removes the human from the loop entirely while still producing appropriate questions that improve students' learning engagement. The ever-growing volume of educational content makes it increasingly difficult to manually author enough practice or quiz questions to accompany it. Since reading comprehension can be improved by asking the right questions, this work offers a Transformer-based question generation model for autonomously producing quiz questions from educational material such as eBooks. It proposes a contrastive training approach for the ‘Text-to-Text Transfer Transformer’ (T5) in which the model (T5-eQG) first creates a summary of the input document and then automatically generates questions from it. Our model shows promising results over earlier neural-network-based and rule-based question generation models on benchmark datasets and NCERT eBooks.
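The abstract mentions contrastive training of T5 but the page does not spell out the objective. As a minimal illustrative sketch (not the paper's released implementation), an in-batch contrastive objective of the InfoNCE family can be written as follows; the function name, the temperature value, and the use of in-batch negatives are all assumptions for illustration:

```python
import numpy as np

def info_nce_loss(anchors, positives, temperature=0.1):
    """Illustrative InfoNCE-style contrastive loss (not the paper's exact objective).

    anchors, positives: (batch, dim) arrays. Row i of `positives` is the
    positive pair for row i of `anchors`; every other row in the batch
    serves as an in-batch negative.
    """
    # L2-normalise so dot products become cosine similarities
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    logits = a @ p.T / temperature                  # (batch, batch) similarity matrix
    # Cross-entropy where the diagonal entries are the correct classes
    logits -= logits.max(axis=1, keepdims=True)     # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))
```

Matched anchor/positive pairs should yield a lower loss than mismatched ones, which is the property such an objective exploits during fine-tuning.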
Anthology ID:
2022.icon-main.15
Volume:
Proceedings of the 19th International Conference on Natural Language Processing (ICON)
Month:
December
Year:
2022
Address:
New Delhi, India
Editors:
Md. Shad Akhtar, Tanmoy Chakraborty
Venue:
ICON
Publisher:
Association for Computational Linguistics
Pages:
109–115
URL:
https://aclanthology.org/2022.icon-main.15
Cite (ACL):
Shobhan Kumar, Arun Chauhan, and Pavan Kumar. 2022. Augmenting eBooks with with recommended questions using contrastive fine-tuned T5. In Proceedings of the 19th International Conference on Natural Language Processing (ICON), pages 109–115, New Delhi, India. Association for Computational Linguistics.
Cite (Informal):
Augmenting eBooks with with recommended questions using contrastive fine-tuned T5 (Kumar et al., ICON 2022)
PDF:
https://aclanthology.org/2022.icon-main.15.pdf