Improving Automated Distractor Generation for Math Multiple-choice Questions with Overgenerate-and-rank

Alexander Scarlatos, Wanyong Feng, Andrew Lan, Simon Woodhead, Digory Smith


Abstract
Multiple-choice questions (MCQs) are commonly used across all levels of math education since they can be deployed and graded at a large scale. A critical component of MCQs is the distractors, i.e., incorrect answers crafted to reflect common student errors or misconceptions. Automatically generating distractors for math MCQs, e.g., with large language models, remains challenging. In this work, we propose a novel method to enhance the quality of generated distractors through overgenerate-and-rank, training a ranking model to predict how likely distractors are to be selected by real students. Experimental results on a real-world dataset and human evaluation with math teachers show that our ranking model increases alignment with human-authored distractors, although human-authored ones are still preferred over generated ones.
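The overgenerate-and-rank idea from the abstract can be sketched as follows. This is a minimal illustration, not the paper's method: the generator and the ranking model below are toy stand-ins (in the paper, candidates come from a large language model and the ranker is trained on real student response data).

```python
# Toy sketch of overgenerate-and-rank for math MCQ distractors.
# All functions here are hypothetical stand-ins for illustration only.

def overgenerate(question: str, n: int = 6) -> list[int]:
    """Stand-in generator: propose n candidate incorrect answers.

    In practice an LLM would generate candidates reflecting plausible
    student errors; here we simply perturb the correct answer.
    """
    correct = eval(question)  # e.g. "3 * 4" -> 12
    return [correct + d for d in range(-n // 2, n // 2 + 1) if d != 0]

def rank_score(question: str, distractor: int) -> float:
    """Stand-in ranking model: score how likely a student would pick this.

    The paper trains a model on real student selections; this toy scorer
    just prefers distractors closer to the correct answer.
    """
    return -abs(eval(question) - distractor)

def top_k_distractors(question: str, k: int = 3) -> list[int]:
    """Overgenerate candidates, then keep the k highest-ranked ones."""
    candidates = overgenerate(question)
    return sorted(candidates,
                  key=lambda d: rank_score(question, d),
                  reverse=True)[:k]
```

For example, `top_k_distractors("3 * 4", 3)` overgenerates six perturbed answers around 12 and keeps the three scored as most plausible, never including the correct answer itself.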
Anthology ID:
2024.bea-1.19
Volume:
Proceedings of the 19th Workshop on Innovative Use of NLP for Building Educational Applications (BEA 2024)
Month:
June
Year:
2024
Address:
Mexico City, Mexico
Editors:
Ekaterina Kochmar, Marie Bexte, Jill Burstein, Andrea Horbach, Ronja Laarmann-Quante, Anaïs Tack, Victoria Yaneva, Zheng Yuan
Venue:
BEA
SIG:
SIGEDU
Publisher:
Association for Computational Linguistics
Pages:
222–231
URL:
https://aclanthology.org/2024.bea-1.19
Cite (ACL):
Alexander Scarlatos, Wanyong Feng, Andrew Lan, Simon Woodhead, and Digory Smith. 2024. Improving Automated Distractor Generation for Math Multiple-choice Questions with Overgenerate-and-rank. In Proceedings of the 19th Workshop on Innovative Use of NLP for Building Educational Applications (BEA 2024), pages 222–231, Mexico City, Mexico. Association for Computational Linguistics.
Cite (Informal):
Improving Automated Distractor Generation for Math Multiple-choice Questions with Overgenerate-and-rank (Scarlatos et al., BEA 2024)
PDF:
https://aclanthology.org/2024.bea-1.19.pdf