Distractor Generation based on Text2Text Language Models with Pseudo Kullback-Leibler Divergence Regulation

Hui-Juan Wang, Kai-Yu Hsieh, Han-Cheng Yu, Jui-Ching Tsou, Yu An Shih, Chen-Hua Huang, Yao-Chung Fan


Abstract
In this paper, we address the task of cloze-style multiple-choice question (MCQ) distractor generation. Our study features the following designs. First, we formulate cloze distractor generation as a Text2Text task. Second, we propose a pseudo Kullback-Leibler divergence for regulating the generation to account for the item discrimination index used in educational evaluation. Third, we explore a candidate augmentation strategy and multi-task training with cloze-related tasks to further boost generation performance. Through experiments on benchmark datasets, our best-performing model advances the state-of-the-art result from 10.81 to 22.00 (P@1 score).
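To make the regulation idea concrete, below is a minimal sketch of how a KL-divergence-style penalty could be combined with a standard Text2Text generation loss. This is a hypothetical illustration only: the function name, the choice of reference distribution, and the kl_weight hyperparameter are assumptions, and the paper's actual "pseudo KL" formulation is not reproduced here.

```python
# Hypothetical sketch: a KL-style regulation term added to a seq2seq loss.
# Not the paper's actual pseudo-KL definition.
import torch
import torch.nn.functional as F

def kl_regulated_loss(lm_logits, labels, ref_logits, kl_weight=0.1):
    """Cross-entropy generation loss plus a KL penalty toward a reference
    distribution (e.g., the model's distribution over an alternative target).

    lm_logits:  (batch, seq_len, vocab) logits for the generated distractor
    labels:     (batch, seq_len) target token ids (-100 = ignored positions)
    ref_logits: (batch, seq_len, vocab) logits defining the reference distribution
    kl_weight:  assumed scaling hyperparameter for the regulation term
    """
    # Standard Text2Text objective: token-level cross entropy.
    ce = F.cross_entropy(
        lm_logits.reshape(-1, lm_logits.size(-1)),
        labels.reshape(-1),
        ignore_index=-100,
    )
    # KL(P_reference || P_distractor), averaged over the batch.
    log_p = F.log_softmax(lm_logits, dim=-1)
    q = F.softmax(ref_logits, dim=-1)
    kl = F.kl_div(log_p, q, reduction="batchmean")
    return ce + kl_weight * kl
```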
Anthology ID:
2023.findings-acl.790
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
12477–12491
URL:
https://aclanthology.org/2023.findings-acl.790
DOI:
10.18653/v1/2023.findings-acl.790
Cite (ACL):
Hui-Juan Wang, Kai-Yu Hsieh, Han-Cheng Yu, Jui-Ching Tsou, Yu An Shih, Chen-Hua Huang, and Yao-Chung Fan. 2023. Distractor Generation based on Text2Text Language Models with Pseudo Kullback-Leibler Divergence Regulation. In Findings of the Association for Computational Linguistics: ACL 2023, pages 12477–12491, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Distractor Generation based on Text2Text Language Models with Pseudo Kullback-Leibler Divergence Regulation (Wang et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-acl.790.pdf