Sentiment Analysis of Sinhala News Comments Using Transformers

Isuru Bandaranayake, Hakim Usoof


Abstract
Sentiment analysis has witnessed significant advancements with the emergence of deep learning models such as transformers. Transformer models adopt the self-attention mechanism and have achieved state-of-the-art performance across various natural language processing (NLP) tasks, including sentiment analysis. However, few studies have explored the application of these recent advances to sentiment analysis of Sinhala text. This study addresses that gap by employing transformer models, namely BERT, DistilBERT, RoBERTa, and XLM-RoBERTa (XLM-R), for sentiment analysis of Sinhala news comments. Experiments were conducted with four classes (positive, negative, neutral, and conflict) and with three classes (positive, negative, and neutral). The results show that the XLM-R-large model outperformed the other four models as well as the transformer models used in previous studies on the Sinhala language. XLM-R-large achieved an accuracy of 65.84% and a macro-F1 score of 62.04% for four-class sentiment analysis, and an accuracy of 75.90% and a macro-F1 score of 72.31% for three classes.
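
The following is a minimal sketch, not the authors' released code, of the fine-tuning setup the abstract describes: loading XLM-R-large with a four-class classification head via the Hugging Face Transformers library. The placeholder dataset contents, output path, and hyperparameters are assumptions for illustration; the paper's actual Sinhala news-comment dataset would be loaded in their place.

    from datasets import Dataset
    from transformers import (
        AutoModelForSequenceClassification,
        AutoTokenizer,
        Trainer,
        TrainingArguments,
    )

    # Four-class setup from the paper; drop "conflict" for the three-class variant.
    labels = ["positive", "negative", "neutral", "conflict"]

    tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-large")
    model = AutoModelForSequenceClassification.from_pretrained(
        "xlm-roberta-large", num_labels=len(labels)
    )

    # Hypothetical placeholder comments and labels, for illustration only.
    train = Dataset.from_dict(
        {
            "text": ["හොඳ තීරණයක්", "නරක ප්‍රවෘත්තියක්"],
            "label": [0, 1],
        }
    )

    def tokenize(batch):
        # XLM-R's SentencePiece tokenizer handles Sinhala script natively.
        return tokenizer(
            batch["text"], truncation=True, padding="max_length", max_length=128
        )

    train = train.map(tokenize, batched=True)

    trainer = Trainer(
        model=model,
        args=TrainingArguments(
            output_dir="xlmr-sinhala-sentiment",  # assumed output path
            num_train_epochs=3,                   # assumed hyperparameters
            per_device_train_batch_size=16,
        ),
        train_dataset=train,
    )
    trainer.train()

The same scaffolding applies to the other checkpoints compared in the paper; only the pretrained model name and tokenizer change.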
Anthology ID:
2025.indonlp-1.9
Volume:
Proceedings of the First Workshop on Natural Language Processing for Indo-Aryan and Dravidian Languages
Month:
January
Year:
2025
Address:
Abu Dhabi
Editors:
Ruvan Weerasinghe, Isuri Anuradha, Deshan Sumanathilaka
Venues:
IndoNLP | WS
Publisher:
Association for Computational Linguistics
Pages:
74–82
URL:
https://aclanthology.org/2025.indonlp-1.9/
Cite (ACL):
Isuru Bandaranayake and Hakim Usoof. 2025. Sentiment Analysis of Sinhala News Comments Using Transformers. In Proceedings of the First Workshop on Natural Language Processing for Indo-Aryan and Dravidian Languages, pages 74–82, Abu Dhabi. Association for Computational Linguistics.
Cite (Informal):
Sentiment Analysis of Sinhala News Comments Using Transformers (Bandaranayake & Usoof, IndoNLP 2025)
PDF:
https://aclanthology.org/2025.indonlp-1.9.pdf