Comparing domain-specific and domain-general BERT variants for inferred real-world knowledge through rare grammatical features in Serbian

Sofia Lee, Jelke Bloem


Abstract
Transfer learning is one of the prevailing approaches towards training language-specific BERT models. However, some languages have uncommon features that may prove challenging for domain-general models but not for domain-specific ones. Comparing the performance of BERTić, a Bosnian-Croatian-Montenegrin-Serbian model, and Multilingual BERT on a Named-Entity Recognition (NER) task and a Masked Language Modelling (MLM) task built around a rare phenomenon, indeclinable female foreign names in Serbian, reveals how the different training approaches affect their performance. Multilingual BERT is shown to perform better than BERTić on the NER task, but BERTić substantially outperforms it on the MLM task. Thus, both domain-general and domain-specific training have their applications, depending on the task at hand.
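To make the MLM comparison concrete, below is a minimal sketch of a fill-mask probe using the Hugging Face Transformers library. The model name bert-base-multilingual-cased is the standard Multilingual BERT checkpoint, but the Serbian probe sentence is an illustrative assumption, not one of the authors' materials; note also that BERTić is ELECTRA-based, so probing it would require a different scoring setup than the standard fill-mask head shown here.

from transformers import pipeline

# Multilingual BERT has a standard MLM head, so the fill-mask
# pipeline can score candidates for a masked slot directly.
mbert_fill = pipeline("fill-mask", model="bert-base-multilingual-cased")

# Hypothetical Serbian probe sentence: the masked slot is a position
# where an indeclinable female foreign name could appear.
sentence = "Razgovarao sam sa [MASK] o novom filmu."

# Print the top 5 predicted fillers with their probabilities.
for prediction in mbert_fill(sentence, top_k=5):
    print(f"{prediction['token_str']:>15}  {prediction['score']:.4f}")

A probe of this kind checks whether the model's filler distribution reflects real-world knowledge of names; comparing the ranked candidates across models is one simple way to operationalise the MLM comparison the abstract describes.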
Anthology ID:
2023.bsnlp-1.7
Volume:
Proceedings of the 9th Workshop on Slavic Natural Language Processing 2023 (SlavicNLP 2023)
Month:
May
Year:
2023
Address:
Dubrovnik, Croatia
Editors:
Jakub Piskorski, Michał Marcińczuk, Preslav Nakov, Maciej Ogrodniczuk, Senja Pollak, Pavel Přibáň, Piotr Rybak, Josef Steinberger, Roman Yangarber
Venue:
BSNLP
Publisher:
Association for Computational Linguistics
Pages:
47–60
URL:
https://aclanthology.org/2023.bsnlp-1.7
DOI:
10.18653/v1/2023.bsnlp-1.7
Cite (ACL):
Sofia Lee and Jelke Bloem. 2023. Comparing domain-specific and domain-general BERT variants for inferred real-world knowledge through rare grammatical features in Serbian. In Proceedings of the 9th Workshop on Slavic Natural Language Processing 2023 (SlavicNLP 2023), pages 47–60, Dubrovnik, Croatia. Association for Computational Linguistics.
Cite (Informal):
Comparing domain-specific and domain-general BERT variants for inferred real-world knowledge through rare grammatical features in Serbian (Lee & Bloem, BSNLP 2023)
PDF:
https://aclanthology.org/2023.bsnlp-1.7.pdf
Video:
https://aclanthology.org/2023.bsnlp-1.7.mp4