Multi-headed Architecture Based on BERT for Grammatical Errors Correction
Bohdan Didenko | Julia Shaptala
Proceedings of the Fourteenth Workshop on Innovative Use of NLP for Building Educational Applications
In this paper, we describe our approach to GEC, which uses the BERT model to create an encoded representation, together with several enhancements of our own. The key component is a set of “Heads”: fully-connected networks that first locate errors and then provide correction recommendations for the highlighted part of the sentence only. The main advantages of our solution are increased system throughput and reduced processing time while maintaining high GEC accuracy.
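To illustrate the idea of fully-connected heads operating on an encoder's token representations, the following is a minimal NumPy sketch. It is not the authors' implementation: the toy dimensions, random weights, and the names `linear_head`, `w_det`, and `w_cor` are all hypothetical, and a random matrix stands in for BERT's hidden states. It shows only the control flow the abstract describes: a detection head flags suspect tokens, and a correction head is run on the flagged positions alone, which is what saves computation.

```python
import numpy as np

rng = np.random.default_rng(0)

HIDDEN, VOCAB = 8, 16  # toy sizes; a real BERT encoder uses 768+ dimensions

def linear_head(x, w, b):
    # A fully-connected "head" applied to token representations.
    return x @ w + b

# Stand-in for BERT output: one hidden vector per token.
tokens = ["He", "go", "to", "school"]
hidden = rng.normal(size=(len(tokens), HIDDEN))

# Detection head: per-token probability that the token is erroneous.
w_det, b_det = rng.normal(size=(HIDDEN, 1)), np.zeros(1)
error_prob = 1.0 / (1.0 + np.exp(-linear_head(hidden, w_det, b_det))).ravel()

# Correction head: applied only to the tokens the detector flagged,
# rather than scoring every position in the sentence.
w_cor, b_cor = rng.normal(size=(HIDDEN, VOCAB)), np.zeros(VOCAB)
flagged = np.where(error_prob > 0.5)[0]
suggestions = {int(i): int(np.argmax(linear_head(hidden[i], w_cor, b_cor)))
               for i in flagged}
print(suggestions)  # maps flagged token index -> suggested vocabulary id
```

In a trained system the weights would of course be learned, and the correction head would predict edits over a real vocabulary; the sketch only demonstrates the two-stage detect-then-correct structure.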