Sentiment Analysis For Bengali Using Transformer Based Models

Anirban Bhowmick, Abhik Jana


Abstract
Sentiment analysis is one of the key Natural Language Processing (NLP) tasks that has been attempted extensively by researchers for resource-rich languages like English. But for low-resource languages like Bengali, very few attempts have been made, due to various reasons including a lack of corpora to train machine learning models and a lack of gold-standard datasets for evaluation. However, with the emergence of transformer models pre-trained in several languages, researchers are showing interest in investigating the applicability of these models in several NLP tasks, especially for low-resource languages. In this paper, we investigate the usefulness of two pre-trained transformer models, namely multilingual BERT and XLM-RoBERTa (with fine-tuning), for sentiment analysis for the Bengali language. We use three datasets for the Bengali language for evaluation and produce state-of-the-art performance, even reaching a maximum of 95% accuracy for a two-class sentiment classification task. We believe this work can serve as a good benchmark as far as sentiment analysis for the Bengali language is concerned.
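The setup the abstract describes can be sketched with the Hugging Face transformers library: load a pre-trained multilingual encoder, attach a fresh two-class classification head, and fine-tune it on labelled Bengali text. This is a minimal illustration, not the authors' code; the checkpoint names are the standard public ones for multilingual BERT and XLM-RoBERTa, and the helper names are my own.

```python
# A minimal sketch (not the authors' code) of the setup the abstract
# describes: a pre-trained multilingual encoder with a fresh two-class
# classification head, plus the accuracy metric used for evaluation.

def accuracy(predictions, gold):
    """Fraction of predicted labels that match the gold labels."""
    assert len(predictions) == len(gold)
    return sum(p == g for p, g in zip(predictions, gold)) / len(gold)

def build_classifier(checkpoint="bert-base-multilingual-cased", num_labels=2):
    """Load a pre-trained encoder with an untrained classification head.

    Pass checkpoint="xlm-roberta-base" to get the second model the paper
    compares. The import is kept inside the function so the pure helper
    above works even without the transformers library installed.
    """
    from transformers import AutoModelForSequenceClassification, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(checkpoint)
    model = AutoModelForSequenceClassification.from_pretrained(
        checkpoint, num_labels=num_labels
    )
    return tokenizer, model
```

Fine-tuning would then proceed with a standard training loop (or the transformers `Trainer` API) over tokenized Bengali sentences, with `accuracy` computed on a held-out test split; the paper's exact hyperparameters are not reproduced here.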
Anthology ID:
2021.icon-main.58
Volume:
Proceedings of the 18th International Conference on Natural Language Processing (ICON)
Month:
December
Year:
2021
Address:
National Institute of Technology Silchar, Silchar, India
Editors:
Sivaji Bandyopadhyay, Sobha Lalitha Devi, Pushpak Bhattacharyya
Venue:
ICON
Publisher:
NLP Association of India (NLPAI)
Pages:
481–486
URL:
https://aclanthology.org/2021.icon-main.58
Cite (ACL):
Anirban Bhowmick and Abhik Jana. 2021. Sentiment Analysis For Bengali Using Transformer Based Models. In Proceedings of the 18th International Conference on Natural Language Processing (ICON), pages 481–486, National Institute of Technology Silchar, Silchar, India. NLP Association of India (NLPAI).
Cite (Informal):
Sentiment Analysis For Bengali Using Transformer Based Models (Bhowmick & Jana, ICON 2021)
PDF:
https://aclanthology.org/2021.icon-main.58.pdf
Optional supplementary material:
 2021.icon-main.58.OptionalSupplementaryMaterial.pdf