HerBERT: Efficiently Pretrained Transformer-based Language Model for Polish

Robert Mroczkowski, Piotr Rybak, Alina Wróblewska, Ireneusz Gawlik


Abstract
BERT-based models are currently used for solving nearly all Natural Language Processing (NLP) tasks and most often achieve state-of-the-art results. Therefore, the NLP community conducts extensive research on understanding these models, but above all on designing effective and efficient training procedures. Several ablation studies investigating how to train BERT-like models have been carried out, but the vast majority of them concern only the English language. A training procedure designed for English is not necessarily universal or applicable to other, especially typologically different, languages. Therefore, this paper presents the first ablation study focused on Polish, which, unlike the isolating English language, is a fusional language. We design and thoroughly evaluate a pretraining procedure for transferring knowledge from multilingual to monolingual BERT-based models. In addition to multilingual model initialization, other factors that possibly influence pretraining are also explored, i.e., the training objective, corpus size, BPE-Dropout, and pretraining length. Based on the proposed procedure, a Polish BERT-based language model – HerBERT – is trained. This model achieves state-of-the-art results on multiple downstream tasks.
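For reference, the resulting HerBERT checkpoints can be used with the Hugging Face transformers library. The following is a minimal sketch, assuming the base checkpoint is published on the Hugging Face Hub under the identifier allegro/herbert-base-cased (identifier assumed here, not stated in the abstract):

    from transformers import AutoTokenizer, AutoModel

    # Load the pretrained Polish model and its tokenizer.
    # Model identifier is assumed; adjust to the released checkpoint name.
    tokenizer = AutoTokenizer.from_pretrained("allegro/herbert-base-cased")
    model = AutoModel.from_pretrained("allegro/herbert-base-cased")

    # Encode a sample Polish sentence and obtain contextual embeddings.
    inputs = tokenizer("HerBERT to model językowy dla języka polskiego.", return_tensors="pt")
    outputs = model(**inputs)
    print(outputs.last_hidden_state.shape)  # (batch, sequence length, hidden size)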
Anthology ID:
2021.bsnlp-1.1
Volume:
Proceedings of the 8th Workshop on Balto-Slavic Natural Language Processing
Month:
April
Year:
2021
Address:
Kyiv, Ukraine
Editors:
Bogdan Babych, Olga Kanishcheva, Preslav Nakov, Jakub Piskorski, Lidia Pivovarova, Vasyl Starko, Josef Steinberger, Roman Yangarber, Michał Marcińczuk, Senja Pollak, Pavel Přibáň, Marko Robnik-Šikonja
Venue:
BSNLP
SIG:
SIGSLAV
Publisher:
Association for Computational Linguistics
Pages:
1–10
URL:
https://aclanthology.org/2021.bsnlp-1.1
Cite (ACL):
Robert Mroczkowski, Piotr Rybak, Alina Wróblewska, and Ireneusz Gawlik. 2021. HerBERT: Efficiently Pretrained Transformer-based Language Model for Polish. In Proceedings of the 8th Workshop on Balto-Slavic Natural Language Processing, pages 1–10, Kyiv, Ukraine. Association for Computational Linguistics.
Cite (Informal):
HerBERT: Efficiently Pretrained Transformer-based Language Model for Polish (Mroczkowski et al., BSNLP 2021)
PDF:
https://aclanthology.org/2021.bsnlp-1.1.pdf
Data:
CCNet, KLEJ, OpenSubtitles, PSC, PolEmo 2.0