Exploring Methods for Generating Feedback Comments for Writing Learning

Kazuaki Hanawa, Ryo Nagata, Kentaro Inui


Abstract
The task of generating explanatory notes for language learners is known as feedback comment generation. Although various generation techniques are available, little is known about which methods are appropriate for this task. Nagata (2019) demonstrates the effectiveness of neural-retrieval-based methods in generating feedback comments for preposition use. Retrieval-based methods, however, are limited in that they can only output feedback comments that already exist in the training data. Furthermore, feedback comments are also needed for grammatical and writing items other than preposition use, which remains unaddressed. To shed light on these points, we investigate a wider range of methods for generating a wider variety of feedback comments in this study. Our close analysis of the task leads us to investigate three different architectures for comment generation: (i) a neural-retrieval-based method as a baseline, (ii) a pointer-generator-based method as a neural seq2seq method, and (iii) a retrieve-and-edit method, a hybrid of (i) and (ii). Intuitively, the pointer-generator should outperform neural retrieval, and retrieve-and-edit should perform best. In our experiments, however, this expectation is completely overturned. We closely analyze the results to reveal the major causes of these counter-intuitive results and report on our findings from the experiments.
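The retrieval-based baseline's limitation noted in the abstract, that it can only output comments already present in the training data, can be made concrete with a minimal sketch. The paper's baseline retrieves with learned neural representations; here, a simple token-overlap (Jaccard) similarity stands in for the learned scorer, and the training pairs are hypothetical examples, not data from the paper.

```python
# Illustrative sketch of a retrieval-based feedback comment generator.
# The paper's baseline uses learned neural representations; token-level
# Jaccard similarity is used here only as a stand-in scorer.
# The (sentence, comment) pairs below are hypothetical.

def jaccard(a, b):
    """Token-level Jaccard similarity between two strings."""
    sa, sb = set(a.lower().split()), set(b.lower().split())
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

def retrieve_comment(learner_sentence, training_pairs):
    """Return the feedback comment attached to the most similar
    training sentence. Note: the output is always a comment that
    already exists in the training data -- the key limitation of
    purely retrieval-based methods."""
    _, best_comment = max(
        training_pairs, key=lambda pair: jaccard(learner_sentence, pair[0])
    )
    return best_comment

# Hypothetical training pairs for preposition-use feedback.
training_pairs = [
    ("I am interested on music.",
     "Use the preposition 'in' with 'interested'."),
    ("She arrived to the station.",
     "The verb 'arrive' takes 'at' for places."),
]

print(retrieve_comment("He is interested on sports.", training_pairs))
# → Use the preposition 'in' with 'interested'.
```

A seq2seq model such as the pointer-generator can in principle compose novel comments instead of copying stored ones, which is what motivates architectures (ii) and (iii) in the paper.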
Anthology ID:
2021.emnlp-main.766
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
9719–9730
URL:
https://aclanthology.org/2021.emnlp-main.766
DOI:
10.18653/v1/2021.emnlp-main.766
Cite (ACL):
Kazuaki Hanawa, Ryo Nagata, and Kentaro Inui. 2021. Exploring Methods for Generating Feedback Comments for Writing Learning. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 9719–9730, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Exploring Methods for Generating Feedback Comments for Writing Learning (Hanawa et al., EMNLP 2021)
PDF:
https://aclanthology.org/2021.emnlp-main.766.pdf
Video:
https://aclanthology.org/2021.emnlp-main.766.mp4
Code:
k-hanawa/fcg_emnlp2021