Attention-based Recurrent Convolutional Neural Network for Automatic Essay Scoring

Fei Dong, Yue Zhang, Jie Yang


Abstract
Neural network models have recently been applied to the task of automatic essay scoring, giving promising results. Existing work used recurrent neural networks and convolutional neural networks to model input essays, giving grades based on a single vector representation of the essay. However, the relative advantages of RNNs and CNNs have not been compared. In addition, different parts of the essay can contribute differently to scoring, which is not captured by existing models. We address these issues by building a hierarchical sentence-document model to represent essays, using the attention mechanism to automatically decide the relative weights of words and sentences. Results show that our model outperforms the previous state-of-the-art methods, demonstrating the effectiveness of the attention mechanism.
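
The attention pooling described in the abstract (learned relative weights over word and sentence representations, summed into a single vector) can be illustrated with the following minimal sketch. This is not the authors' released code: the PyTorch framing, module names, and dimensions are assumptions made for illustration only.

import torch
import torch.nn as nn


class AttentionPooling(nn.Module):
    """Pool a sequence of vectors into one vector via learned attention weights."""

    def __init__(self, hidden_dim: int):
        super().__init__()
        self.proj = nn.Linear(hidden_dim, hidden_dim)          # m_i = tanh(W h_i + b)
        self.context = nn.Parameter(torch.randn(hidden_dim))   # attention context vector

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        # h: (batch, seq_len, hidden_dim) word or sentence representations
        m = torch.tanh(self.proj(h))            # (batch, seq_len, hidden_dim)
        scores = m @ self.context               # (batch, seq_len) unnormalized weights
        alpha = torch.softmax(scores, dim=-1)   # relative weights of words/sentences
        return (alpha.unsqueeze(-1) * h).sum(dim=1)   # weighted sum: (batch, hidden_dim)


# Usage sketch: pool word vectors into sentence vectors, pool sentence vectors
# into an essay vector, and map the essay vector to a score in (0, 1).
if __name__ == "__main__":
    word_pool = AttentionPooling(hidden_dim=100)
    sent_vecs = word_pool(torch.randn(8, 20, 100))        # 8 sentences, 20 words each
    essay_pool = AttentionPooling(hidden_dim=100)
    essay_vec = essay_pool(sent_vecs.unsqueeze(0))        # one essay of 8 sentences
    score = torch.sigmoid(nn.Linear(100, 1)(essay_vec))   # scaled essay score
    print(score.shape)
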
Anthology ID:
K17-1017
Volume:
Proceedings of the 21st Conference on Computational Natural Language Learning (CoNLL 2017)
Month:
August
Year:
2017
Address:
Vancouver, Canada
Editors:
Roger Levy, Lucia Specia
Venue:
CoNLL
SIG:
SIGNLL
Publisher:
Association for Computational Linguistics
Pages:
153–162
URL:
https://aclanthology.org/K17-1017
DOI:
10.18653/v1/K17-1017
Cite (ACL):
Fei Dong, Yue Zhang, and Jie Yang. 2017. Attention-based Recurrent Convolutional Neural Network for Automatic Essay Scoring. In Proceedings of the 21st Conference on Computational Natural Language Learning (CoNLL 2017), pages 153–162, Vancouver, Canada. Association for Computational Linguistics.
Cite (Informal):
Attention-based Recurrent Convolutional Neural Network for Automatic Essay Scoring (Dong et al., CoNLL 2017)
PDF:
https://aclanthology.org/K17-1017.pdf