Transformers and Large Language Models for Hope Speech Detection: A Multilingual Approach

Diana Patricia Madera-Espíndola, Zoe Caballero-Domínguez, Valeria J. Ramírez-Macías, Sabur Butt, Hector G. Ceballos


Abstract
With the rise of Generative AI (GenAI) models in recent years, it is important to understand how they perform compared with other deep learning techniques across tasks and languages. In this study, we benchmark ChatGPT-4 against XLM-RoBERTa, a multilingual transformer-based model, on the Multilingual Binary and Multiclass Hope Speech Detection tasks of the PolyHope-M 2025 competition. We further explore prompting techniques and data augmentation to determine which approach yields the best performance. In our experiments, XLM-RoBERTa frequently outperformed ChatGPT-4, attaining F1 scores of 0.86 for English, 0.83 for Spanish, 0.86 for German, and 0.94 for Urdu in Task 1, and 0.73 for English, 0.70 for Spanish, 0.69 for German, and 0.60 for Urdu in Task 2.
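For readers unfamiliar with the fine-tuning setup the abstract refers to, the sketch below illustrates how a multilingual transformer such as XLM-RoBERTa can be fine-tuned for binary hope speech classification with the Hugging Face Transformers library. This is a minimal illustrative example, not the authors' released code; the example texts, labels, and hyperparameters are placeholder assumptions.

# Minimal sketch: fine-tuning XLM-RoBERTa for binary hope speech detection.
# Dataset contents and hyperparameters below are illustrative assumptions.
from datasets import Dataset
from transformers import (
    AutoTokenizer,
    AutoModelForSequenceClassification,
    TrainingArguments,
    Trainer,
)

# Toy stand-in for the PolyHope-M training split (1 = hope, 0 = not hope).
train = Dataset.from_dict({
    "text": ["I believe things will get better soon.", "Nothing ever changes."],
    "label": [1, 0],
})

tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")

def tokenize(batch):
    # Truncate long posts; padding is handled dynamically by the Trainer's
    # default data collator when a tokenizer is supplied.
    return tokenizer(batch["text"], truncation=True, max_length=128)

train = train.map(tokenize, batched=True)

model = AutoModelForSequenceClassification.from_pretrained(
    "xlm-roberta-base", num_labels=2
)

args = TrainingArguments(
    output_dir="xlmr-hope",
    per_device_train_batch_size=16,
    num_train_epochs=3,
    learning_rate=2e-5,
)

trainer = Trainer(model=model, args=args, train_dataset=train, tokenizer=tokenizer)
trainer.train()

For the multiclass setting (Task 2), the same skeleton applies with num_labels set to the number of hope categories and the label column encoded accordingly.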
Anthology ID:
2025.r2lm-1.8
Volume:
Proceedings of the First Workshop on Comparative Performance Evaluation: From Rules to Language Models
Month:
September
Year:
2025
Address:
Varna, Bulgaria
Editors:
Alicia Picazo-Izquierdo, Ernesto Luis Estevanell-Valladares, Ruslan Mitkov, Rafael Muñoz Guillena, Raúl García Cerdá
Venues:
R2LM | WS
Publisher:
INCOMA Ltd., Shoumen, Bulgaria
Pages:
67–76
URL:
https://aclanthology.org/2025.r2lm-1.8/
Cite (ACL):
Diana Patricia Madera-Espíndola, Zoe Caballero-Domínguez, Valeria J. Ramírez-Macías, Sabur Butt, and Hector G. Ceballos. 2025. Transformers and Large Language Models for Hope Speech Detection: A Multilingual Approach. In Proceedings of the First Workshop on Comparative Performance Evaluation: From Rules to Language Models, pages 67–76, Varna, Bulgaria. INCOMA Ltd., Shoumen, Bulgaria.
Cite (Informal):
Transformers and Large Language Models for Hope Speech Detection: A Multilingual Approach (Madera-Espíndola et al., R2LM 2025)
PDF:
https://aclanthology.org/2025.r2lm-1.8.pdf