Are the Multilingual Models Better? Improving Czech Sentiment with Transformers

Pavel Přibáň, Josef Steinberger


Abstract
In this paper, we aim to improve Czech sentiment analysis with transformer-based models and their multilingual versions. More concretely, we study the task of polarity detection for the Czech language on three sentiment polarity datasets. We fine-tune and perform experiments with five multilingual and three monolingual models. We compare the performance of the monolingual and multilingual models, including a comparison with an older approach based on recurrent neural networks. Furthermore, we test the multilingual models' ability to transfer knowledge from English to Czech (and vice versa) via zero-shot cross-lingual classification. Our experiments show that the large multilingual models can outperform the monolingual models. They can also detect polarity in another language without any training data in that language, with performance at most 4.4 % below that of state-of-the-art monolingually trained models. Moreover, we achieve new state-of-the-art results on all three datasets.
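As a rough illustration of the fine-tuning setup described in the abstract, the sketch below fine-tunes a multilingual transformer for three-class Czech polarity detection with the Hugging Face transformers library. The backbone (xlm-roberta-base), the toy training examples, the label mapping, and all hyperparameters are assumptions chosen for illustration; the authors' actual pipeline lives in the linked repository (pauli31/improving-czech-sentiment-transformers).

```python
# Minimal sketch (not the authors' exact pipeline) of fine-tuning a multilingual
# transformer for three-class Czech polarity detection.
import torch
from transformers import (AutoTokenizer, AutoModelForSequenceClassification,
                          Trainer, TrainingArguments)

MODEL_NAME = "xlm-roberta-base"  # assumed multilingual backbone, for illustration

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME, num_labels=3)

# Toy Czech examples with an illustrative label mapping
# (0 = positive, 1 = negative, 2 = neutral):
#   "Great film, I recommend it." / "A complete waste of time." / "The film ran two hours."
texts = ["Skvělý film, doporučuji.", "Naprostá ztráta času.", "Film trval dvě hodiny."]
labels = [0, 1, 2]

class PolarityDataset(torch.utils.data.Dataset):
    """Wraps tokenized texts and labels for the Trainer."""
    def __init__(self, texts, labels):
        self.enc = tokenizer(texts, truncation=True, padding=True, max_length=128)
        self.labels = labels
    def __len__(self):
        return len(self.labels)
    def __getitem__(self, idx):
        item = {k: torch.tensor(v[idx]) for k, v in self.enc.items()}
        item["labels"] = torch.tensor(self.labels[idx])
        return item

train_ds = PolarityDataset(texts, labels)

# Illustrative hyperparameters; the paper's settings may differ.
args = TrainingArguments(output_dir="out", num_train_epochs=1,
                         per_device_train_batch_size=2, learning_rate=2e-5)
Trainer(model=model, args=args, train_dataset=train_ds).train()
```

For the zero-shot cross-lingual experiments, the same kind of model would be fine-tuned on English data only (e.g., IMDb movie reviews, listed under Data below) and then evaluated directly on the Czech test sets, with no Czech training examples.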
Anthology ID:
2021.ranlp-1.128
Volume:
Proceedings of the International Conference on Recent Advances in Natural Language Processing (RANLP 2021)
Month:
September
Year:
2021
Address:
Held Online
Editors:
Ruslan Mitkov, Galia Angelova
Venue:
RANLP
Publisher:
INCOMA Ltd.
Pages:
1138–1149
URL:
https://aclanthology.org/2021.ranlp-1.128
Cite (ACL):
Pavel Přibáň and Josef Steinberger. 2021. Are the Multilingual Models Better? Improving Czech Sentiment with Transformers. In Proceedings of the International Conference on Recent Advances in Natural Language Processing (RANLP 2021), pages 1138–1149, Held Online. INCOMA Ltd.
Cite (Informal):
Are the Multilingual Models Better? Improving Czech Sentiment with Transformers (Přibáň & Steinberger, RANLP 2021)
PDF:
https://aclanthology.org/2021.ranlp-1.128.pdf
Code
 pauli31/improving-czech-sentiment-transformers
Data
IMDb Movie Reviews