Closing the Loop: Learning to Generate Writing Feedback via Language Model Simulated Student Revisions

Inderjeet Nair, Jiaye Tan, Xiaotian Su, Anne Gere, Xu Wang, Lu Wang


Abstract
Providing feedback is widely recognized as crucial for refining students’ writing skills. Recent advances in language models (LMs) have made it possible to automatically generate feedback that is actionable and well-aligned with human-specified attributes. However, it remains unclear whether the feedback generated by these models is truly effective in enhancing the quality of student revisions. Moreover, prompting LMs with a precise set of instructions to generate feedback is nontrivial, as there is no consensus on which specific attributes lead to improved revising performance. To address these challenges, we propose PROF, which PROduces Feedback via learning from LM-simulated student revisions. PROF iteratively optimizes the feedback generator by directly maximizing the effectiveness of students’ overall revising performance as simulated by LMs. Focusing on an economics essay assignment, we empirically test the efficacy of PROF and observe that our approach not only surpasses a variety of baseline methods in the effectiveness of improving students’ writing but also demonstrates enhanced pedagogical value, even though it was not explicitly trained for this aspect.
Anthology ID:
2024.emnlp-main.928
Volume:
Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2024
Address:
Miami, Florida, USA
Editors:
Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
16636–16657
URL:
https://aclanthology.org/2024.emnlp-main.928
Cite (ACL):
Inderjeet Nair, Jiaye Tan, Xiaotian Su, Anne Gere, Xu Wang, and Lu Wang. 2024. Closing the Loop: Learning to Generate Writing Feedback via Language Model Simulated Student Revisions. In Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing, pages 16636–16657, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal):
Closing the Loop: Learning to Generate Writing Feedback via Language Model Simulated Student Revisions (Nair et al., EMNLP 2024)
PDF:
https://aclanthology.org/2024.emnlp-main.928.pdf