MicroBERT: Effective Training of Low-resource Monolingual BERTs through Parameter Reduction and Multitask Learning

Luke Gessler, Amir Zeldes


Abstract
BERT-style contextualized word embedding models are critical for good performance in most NLP tasks, but they are data-hungry and therefore difficult to train for low-resource languages. In this work, we investigate whether a combination of greatly reduced model size and two linguistically rich auxiliary pretraining tasks (part-of-speech tagging and dependency parsing) can help produce better BERTs in a low-resource setting. Results from 7 diverse languages indicate that our model, MicroBERT, is able to produce marked improvements in downstream task evaluations, including gains up to 18% for parser LAS and 11% for NER F1 compared to an mBERT baseline, and we achieve these results with less than 1% of the parameter count of a multilingual BERT base–sized model. We conclude that training very small BERTs and leveraging any available labeled data for multitask learning during pretraining can produce models which outperform both their multilingual counterparts and traditional fixed embeddings for low-resource languages.
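The paper's landing page carries no code, but the multitask pretraining idea the abstract describes can be sketched briefly: a very small BERT-style encoder trained jointly on masked language modeling plus two supervised auxiliary tasks, POS tagging and dependency head prediction. The sketch below is an illustrative assumption, not the authors' implementation: the class name MicroBERTSketch, the tiny hidden size and layer count, the simple dot-product arc scorer, and the equal loss weighting are all placeholders.

```python
# Minimal sketch (PyTorch) of multitask BERT pretraining: MLM + POS tagging +
# dependency-head prediction on a very small encoder. All hyperparameters and
# head designs here are illustrative assumptions, not the paper's exact setup.
import torch
import torch.nn as nn


class MicroBERTSketch(nn.Module):
    def __init__(self, vocab_size, n_pos_tags, hidden=100, layers=3, heads=5, max_len=512):
        super().__init__()
        self.tok_emb = nn.Embedding(vocab_size, hidden)
        self.pos_emb = nn.Embedding(max_len, hidden)
        enc_layer = nn.TransformerEncoderLayer(
            d_model=hidden, nhead=heads, dim_feedforward=4 * hidden, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers=layers)
        # Task heads: MLM over the vocabulary, POS tagging over the tagset,
        # and a projection used to score token pairs for dependency heads.
        self.mlm_head = nn.Linear(hidden, vocab_size)
        self.pos_head = nn.Linear(hidden, n_pos_tags)
        self.arc_proj = nn.Linear(hidden, hidden)
        self.loss = nn.CrossEntropyLoss(ignore_index=-100)

    def forward(self, token_ids, mlm_labels=None, pos_labels=None, head_labels=None):
        positions = torch.arange(token_ids.size(1), device=token_ids.device)
        h = self.encoder(self.tok_emb(token_ids) + self.pos_emb(positions))

        total = torch.zeros((), device=token_ids.device)
        if mlm_labels is not None:   # original token ids at masked positions, -100 elsewhere
            total = total + self.loss(self.mlm_head(h).transpose(1, 2), mlm_labels)
        if pos_labels is not None:   # gold POS tag ids, -100 where no annotation exists
            total = total + self.loss(self.pos_head(h).transpose(1, 2), pos_labels)
        if head_labels is not None:  # index of each token's dependency head
            # arc_scores[b, i, j]: compatibility of token j as the head of token i
            arc_scores = self.arc_proj(h) @ h.transpose(1, 2)   # (batch, n, n)
            total = total + self.loss(arc_scores.transpose(1, 2), head_labels)
        return total


# Toy usage: two "sentences" of length 8 with all three kinds of supervision.
model = MicroBERTSketch(vocab_size=8000, n_pos_tags=17)
ids = torch.randint(0, 8000, (2, 8))
mlm = torch.full((2, 8), -100, dtype=torch.long)
mlm[:, 3] = ids[:, 3]                      # pretend position 3 was masked
pos = torch.randint(0, 17, (2, 8))
heads = torch.randint(0, 8, (2, 8))
loss = model(ids, mlm_labels=mlm, pos_labels=pos, head_labels=heads)
loss.backward()
```

In a setup like the one the abstract suggests, the POS and dependency labels would come from whatever treebank data is available for the language, while batches of unannotated text would contribute only the MLM term; the supervised heads are discarded after pretraining.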
Anthology ID: 2022.mrl-1.9
Original: 2022.mrl-1.9v1
Version 2: 2022.mrl-1.9v2
Volume: Proceedings of the 2nd Workshop on Multi-lingual Representation Learning (MRL)
Month: December
Year: 2022
Address: Abu Dhabi, United Arab Emirates (Hybrid)
Editors: Duygu Ataman, Hila Gonen, Sebastian Ruder, Orhan Firat, Gözde Gül Sahin, Jamshidbek Mirzakhalov
Venue: MRL
Publisher: Association for Computational Linguistics
Pages: 86–99
URL: https://aclanthology.org/2022.mrl-1.9
DOI: 10.18653/v1/2022.mrl-1.9
Cite (ACL):
Luke Gessler and Amir Zeldes. 2022. MicroBERT: Effective Training of Low-resource Monolingual BERTs through Parameter Reduction and Multitask Learning. In Proceedings of the 2nd Workshop on Multi-lingual Representation Learning (MRL), pages 86–99, Abu Dhabi, United Arab Emirates (Hybrid). Association for Computational Linguistics.
Cite (Informal):
MicroBERT: Effective Training of Low-resource Monolingual BERTs through Parameter Reduction and Multitask Learning (Gessler & Zeldes, MRL 2022)
PDF: https://aclanthology.org/2022.mrl-1.9.pdf