Linguistic Appropriateness and Pedagogic Usefulness of Reading Comprehension Questions

Andrea Horbach, Itziar Aldabe, Marie Bexte, Oier Lopez de Lacalle, Montse Maritxalar


Abstract
Automatic generation of reading comprehension questions is a topic receiving growing interest in the NLP community, but there is currently no consensus on evaluation metrics, and many approaches focus on linguistic quality only while ignoring the pedagogic value and appropriateness of questions. This paper addresses these weaknesses with a new evaluation scheme in which questions from the questionnaire are structured hierarchically, so that human annotators are not confronted with evaluation measures that do not apply to a given question. We show through an annotation study that our scheme can be applied, but that annotators with some level of expertise are needed. We also created and evaluated two new evaluation data sets from the biology domain for Basque and German, composed of questions written by people with an educational background, which will be publicly released. Results show that manually generated questions are in general of both higher linguistic and pedagogic quality, and that among the human-generated questions, teacher-generated ones tend to be the most useful.
Anthology ID:
2020.lrec-1.217
Volume:
Proceedings of the 12th Language Resources and Evaluation Conference
Month:
May
Year:
2020
Address:
Marseille, France
Venue:
LREC
Publisher:
European Language Resources Association
Note:
Pages:
1753–1762
Language:
English
URL:
https://aclanthology.org/2020.lrec-1.217
Cite (ACL):
Andrea Horbach, Itziar Aldabe, Marie Bexte, Oier Lopez de Lacalle, and Montse Maritxalar. 2020. Linguistic Appropriateness and Pedagogic Usefulness of Reading Comprehension Questions. In Proceedings of the 12th Language Resources and Evaluation Conference, pages 1753–1762, Marseille, France. European Language Resources Association.
Cite (Informal):
Linguistic Appropriateness and Pedagogic Usefulness of Reading Comprehension Questions (Horbach et al., LREC 2020)
PDF:
https://aclanthology.org/2020.lrec-1.217.pdf