United We Fine-Tune: Structurally Complementary Datasets for Hope Speech Detection

Priya Dharshini Krishnaraj, Tulio Ferreira Leite da Silva, Gonzalo Freijedo Aduna, Samuel Chen, Farah Benamara, Alda Mari


Abstract
We propose a fine-tuning strategy for English multi-class Hope Speech Detection using Mistral, leveraging two complementary datasets, PolyHope and CDB, within a new unified framework for hope speech detection. While the former provides nuanced hope-related categories such as GENERALIZED, REALISTIC, and UNREALISTIC HOPE, the latter introduces linguistically grounded dimensions including COUNTERFACTUAL, DESIRE, and BELIEF. By fine-tuning Mistral on both datasets, we enable the model to capture deeper semantic representations of hope. In addition to fine-tuning, we develop advanced prompting strategies that provide interpretable, zero-shot alternatives and further inform annotation and classification design. Our approach achieved third place in the multi-class setting (Macro F1=71.77) and sixth place in the binary setting (Macro F1=85.35).
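The page carries no code, so the following is only a minimal illustrative sketch of what instruction-style LoRA fine-tuning of Mistral on a multi-class hope label set could look like, assuming the Hugging Face transformers, peft, and datasets libraries. The checkpoint name, label inventory, example record, and hyperparameters are assumptions for illustration, not the authors' actual configuration.

```python
# Sketch: LoRA fine-tuning of a Mistral checkpoint for multi-class hope speech
# detection. All names, labels, and hyperparameters below are illustrative
# assumptions, not the setup reported in the paper.
from datasets import Dataset
from peft import LoraConfig, get_peft_model
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          Trainer, TrainingArguments)

MODEL = "mistralai/Mistral-7B-Instruct-v0.2"   # assumed checkpoint
LABELS = ["Not Hope", "Generalized Hope", "Realistic Hope", "Unrealistic Hope"]

tokenizer = AutoTokenizer.from_pretrained(MODEL)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(MODEL, device_map="auto")

# Attach low-rank adapters so only a small fraction of weights is trained.
model = get_peft_model(model, LoraConfig(r=16, lora_alpha=32,
                                         target_modules=["q_proj", "v_proj"],
                                         task_type="CAUSAL_LM"))

def to_example(row):
    # Verbalize each instance as an instruction followed by the gold label.
    prompt = (f"Classify the hope expressed in the text as one of "
              f"{', '.join(LABELS)}.\nText: {row['text']}\nLabel: {row['label']}")
    enc = tokenizer(prompt, truncation=True, max_length=512, padding="max_length")
    enc["labels"] = enc["input_ids"].copy()
    return enc

# Hypothetical merged training corpus combining PolyHope- and CDB-style records.
train = Dataset.from_list([
    {"text": "I really think the treatment will work this time.",
     "label": "Realistic Hope"},
]).map(to_example, remove_columns=["text", "label"])

Trainer(model=model,
        args=TrainingArguments("hope-mistral-lora", num_train_epochs=3,
                               per_device_train_batch_size=4,
                               learning_rate=2e-4),
        train_dataset=train).train()
```

The same label verbalization can be reused at inference time as a zero-shot prompt, which is the spirit of the interpretable prompting alternative mentioned in the abstract.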
Anthology ID:
2025.r2lm-1.6
Volume:
Proceedings of the First Workshop on Comparative Performance Evaluation: From Rules to Language Models
Month:
September
Year:
2025
Address:
Varna, Bulgaria
Editors:
Alicia Picazo-Izquierdo, Ernesto Luis Estevanell-Valladares, Ruslan Mitkov, Rafael Muñoz Guillena, Raúl García Cerdá
Venues:
R2LM | WS
Publisher:
INCOMA Ltd., Shoumen, Bulgaria
Pages:
48–58
URL:
https://aclanthology.org/2025.r2lm-1.6/
Cite (ACL):
Priya Dharshini Krishnaraj, Tulio Ferreira Leite da Silva, Gonzalo Freijedo Aduna, Samuel Chen, Farah Benamara, and Alda Mari. 2025. United We Fine-Tune: Structurally Complementary Datasets for Hope Speech Detection. In Proceedings of the First Workshop on Comparative Performance Evaluation: From Rules to Language Models, pages 48–58, Varna, Bulgaria. INCOMA Ltd., Shoumen, Bulgaria.
Cite (Informal):
United We Fine-Tune: Structurally Complementary Datasets for Hope Speech Detection (Krishnaraj et al., R2LM 2025)
PDF:
https://aclanthology.org/2025.r2lm-1.6.pdf