%0 Conference Proceedings
%T Sesame Street to Mount Sinai: BERT-constrained character-level Moses models for multilingual lexical normalization
%A Scherrer, Yves
%A Ljubešić, Nikola
%Y Xu, Wei
%Y Ritter, Alan
%Y Baldwin, Tim
%Y Rahimi, Afshin
%S Proceedings of the Seventh Workshop on Noisy User-generated Text (W-NUT 2021)
%D 2021
%8 November
%I Association for Computational Linguistics
%C Online
%F scherrer-ljubesic-2021-sesame
%X This paper describes the HEL-LJU submissions to the MultiLexNorm shared task on multilingual lexical normalization. Our system is based on a BERT token classification preprocessing step, where for each token the type of the necessary transformation is predicted (none, uppercase, lowercase, capitalize, modify), and a character-level SMT step where the text is translated from original to normalized given the BERT-predicted transformation constraints. For some languages, depending on the results on development data, the training data was extended by back-translating OpenSubtitles data. In the final ranking of the ten participating teams, the HEL-LJU team took second place, scoring better than the previous state of the art.
%R 10.18653/v1/2021.wnut-1.52
%U https://aclanthology.org/2021.wnut-1.52
%U https://doi.org/10.18653/v1/2021.wnut-1.52
%P 465-472