%0 Conference Proceedings
%T GECToR – Grammatical Error Correction: Tag, Not Rewrite
%A Omelianchuk, Kostiantyn
%A Atrasevych, Vitaliy
%A Chernodub, Artem
%A Skurzhanskyi, Oleksandr
%Y Burstein, Jill
%Y Kochmar, Ekaterina
%Y Leacock, Claudia
%Y Madnani, Nitin
%Y Pilán, Ildikó
%Y Yannakoudakis, Helen
%Y Zesch, Torsten
%S Proceedings of the Fifteenth Workshop on Innovative Use of NLP for Building Educational Applications
%D 2020
%8 July
%I Association for Computational Linguistics
%C Seattle, WA, USA → Online
%F omelianchuk-etal-2020-gector
%X In this paper, we present a simple and efficient GEC sequence tagger using a Transformer encoder. Our system is pre-trained on synthetic data and then fine-tuned in two stages: first on errorful corpora, and second on a combination of errorful and error-free parallel corpora. We design custom token-level transformations to map input tokens to target corrections. Our best single-model/ensemble GEC tagger achieves an F_0.5 of 65.3/66.5 on CoNLL-2014 (test) and F_0.5 of 72.4/73.6 on BEA-2019 (test). Its inference speed is up to 10 times as fast as a Transformer-based seq2seq GEC system.
%R 10.18653/v1/2020.bea-1.16
%U https://aclanthology.org/2020.bea-1.16
%U https://doi.org/10.18653/v1/2020.bea-1.16
%P 163-170