Starting from “Zero”: An Incremental Zero-shot Learning Approach for Assessing Peer Feedback Comments

Qinjin Jia, Yupeng Cao, Edward Gehringer


Abstract
Peer assessment is an effective and efficient pedagogical strategy for delivering feedback to learners. Asking students to provide quality feedback, i.e., feedback that offers suggestions and identifies problems, can promote metacognition in reviewers and better assist reviewees in revising their work. Thus, various supervised machine learning algorithms have been proposed to detect quality feedback. However, all these powerful algorithms share the same Achilles' heel: a reliance on sufficient historical data. In other words, collecting enough peer feedback to train a supervised algorithm can take several semesters before the model can be deployed to a new class. In this paper, we present a new paradigm, called incremental zero-shot learning (IZSL), to tackle the problem of lacking sufficient historical data. Our results show that the method can achieve acceptable "cold-start" performance without needing any domain data, and that it outperforms BERT when trained on the same incrementally collected data.
Anthology ID:
2022.bea-1.8
Volume:
Proceedings of the 17th Workshop on Innovative Use of NLP for Building Educational Applications (BEA 2022)
Month:
July
Year:
2022
Address:
Seattle, Washington
Venues:
BEA | NAACL
SIG:
SIGEDU
Publisher:
Association for Computational Linguistics
Pages:
46–50
URL:
https://aclanthology.org/2022.bea-1.8
DOI:
10.18653/v1/2022.bea-1.8
Cite (ACL):
Qinjin Jia, Yupeng Cao, and Edward Gehringer. 2022. Starting from “Zero”: An Incremental Zero-shot Learning Approach for Assessing Peer Feedback Comments. In Proceedings of the 17th Workshop on Innovative Use of NLP for Building Educational Applications (BEA 2022), pages 46–50, Seattle, Washington. Association for Computational Linguistics.
Cite (Informal):
Starting from “Zero”: An Incremental Zero-shot Learning Approach for Assessing Peer Feedback Comments (Jia et al., BEA 2022)
PDF:
https://aclanthology.org/2022.bea-1.8.pdf