2019
Tuning Multilingual Transformers for Language-Specific Named Entity Recognition
Mikhail Arkhipov | Maria Trofimova | Yuri Kuratov | Alexey Sorokin
Proceedings of the 7th Workshop on Balto-Slavic Natural Language Processing
Our paper addresses the problem of multilingual named entity recognition for four languages: Russian, Bulgarian, Czech, and Polish. We solve this task using the BERT model, taking a multilingual model trained on one hundred languages as the base for transfer to the mentioned Slavic languages. Unsupervised pre-training of the BERT model on these four languages allows it to significantly outperform baseline neural approaches and multilingual BERT. A further improvement is achieved by extending BERT with a word-level CRF layer. Our system was submitted to the BSNLP 2019 Shared Task on Multilingual Named Entity Recognition and demonstrated top performance in the multilingual setting for two competition metrics. We have open-sourced the NER models and the BERT model pre-trained on the four Slavic languages.
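To illustrate the architecture the abstract describes, the following PyTorch sketch stacks a linear emission layer and a CRF on top of a pretrained BERT encoder. It is a minimal sketch under stated assumptions, not the authors' released code: the class name BertCrfTagger is hypothetical, the CRF comes from the third-party pytorch-crf package, and for brevity the CRF is applied over all subtokens rather than strictly word-level (first subtoken per word) as in the paper.

```python
import torch.nn as nn
from transformers import AutoModel
from torchcrf import CRF  # pip install pytorch-crf


class BertCrfTagger(nn.Module):
    """Sketch of BERT + CRF for NER (hypothetical class, not the released code).

    The CRF scores entire BIO tag sequences, so transitions between
    adjacent tags are modelled jointly rather than per token.
    """

    def __init__(self, model_name: str, num_tags: int):
        super().__init__()
        self.bert = AutoModel.from_pretrained(model_name)
        self.emissions = nn.Linear(self.bert.config.hidden_size, num_tags)
        self.crf = CRF(num_tags, batch_first=True)

    def forward(self, input_ids, attention_mask, tags=None):
        hidden = self.bert(input_ids, attention_mask=attention_mask).last_hidden_state
        scores = self.emissions(hidden)          # per-token tag scores
        mask = attention_mask.bool()
        if tags is not None:
            # Training: negative log-likelihood of the gold tag sequence.
            return -self.crf(scores, tags, mask=mask, reduction="mean")
        # Inference: Viterbi decoding of the best tag sequence per sentence.
        return self.crf.decode(scores, mask=mask)
```

At inference time, decode returns one list of tag indices per sentence; in the paper's word-level variant, only the emission for the first subtoken of each word would feed the CRF.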