Evaluation of Question Generation Needs More References

Shinhyeok Oh, Hyojun Go, Hyeongdon Moon, Yunsung Lee, Myeongho Jeong, Hyun Seung Lee, Seungtaek Choi


Abstract
Question generation (QG) is the task of generating a valid and fluent question based on a given context and a target answer. Depending on the purpose, instructors can ask questions about different concepts even within the same context, and the same concept can be phrased in different ways. However, QG evaluation usually relies on similarity to a single reference, measured with n-gram-based or learned metrics, which is not sufficient to fully assess the potential of QG methods. To this end, we propose paraphrasing the reference question for more robust QG evaluation. Using large language models such as GPT-3, we create semantically and syntactically diverse questions and then adopt a simple aggregation of popular evaluation metrics as the final score. Through our experiments, we find that using multiple (pseudo) references is more effective for QG evaluation, showing a higher correlation with human evaluations than evaluation against a single reference.
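
As a rough illustration of the multi-reference idea described above, the sketch below scores a generated question against the gold reference plus several pseudo-references and aggregates the per-reference scores. This is not the authors' released code: BLEU via NLTK stands in for whichever metrics are aggregated, the pseudo-references are hard-coded rather than produced by GPT-3, and taking the maximum is only one possible "simple aggregation".

```python
# Minimal sketch (assumptions: NLTK as the example metric, max as the aggregator,
# hand-written paraphrases standing in for LLM-generated pseudo-references).
from nltk.translate.bleu_score import sentence_bleu, SmoothingFunction


def multi_reference_score(candidate: str, references: list[str], aggregate=max) -> float:
    """Score `candidate` against each reference separately, then aggregate."""
    smooth = SmoothingFunction().method1
    cand_tokens = candidate.lower().split()
    scores = [
        sentence_bleu([ref.lower().split()], cand_tokens, smoothing_function=smooth)
        for ref in references
    ]
    return aggregate(scores)


# Hypothetical usage: one gold reference plus paraphrases acting as pseudo-references.
gold = "What year was the Eiffel Tower completed?"
pseudo_refs = [
    "In which year was construction of the Eiffel Tower finished?",
    "When was the Eiffel Tower built?",
]
generated = "When was the Eiffel Tower finished?"
print(multi_reference_score(generated, [gold] + pseudo_refs))
```

With a single reference, a valid but differently worded question can be scored poorly; scoring against several paraphrased references and keeping the best match reduces that penalty, which is the intuition behind the proposed evaluation.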
Anthology ID:
2023.findings-acl.396
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
6358–6367
URL:
https://aclanthology.org/2023.findings-acl.396
DOI:
10.18653/v1/2023.findings-acl.396
Cite (ACL):
Shinhyeok Oh, Hyojun Go, Hyeongdon Moon, Yunsung Lee, Myeongho Jeong, Hyun Seung Lee, and Seungtaek Choi. 2023. Evaluation of Question Generation Needs More References. In Findings of the Association for Computational Linguistics: ACL 2023, pages 6358–6367, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Evaluation of Question Generation Needs More References (Oh et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-acl.396.pdf
Video:
https://aclanthology.org/2023.findings-acl.396.mp4