RSSN at Multilingual Counterspeech Generation: Leveraging Lightweight Transformers for Efficient and Context-Aware Counter-Narrative Generation

Ravindran V


Abstract
This paper presents a system for counterspeech generation, developed for the COLING 2025 shared task. By leveraging the lightweight transformer models DistilBART and T5-small, we optimize computational efficiency while maintaining strong performance. The work includes an in-depth analysis of a multilingual dataset, addressing hate speech instances across diverse languages and target groups. Through systematic error analysis, we identify challenges such as a lack of specificity and context misinterpretation in the generated counter-narratives. Evaluation with BLEU, ROUGE, and BERTScore demonstrates the effectiveness of our approaches, while a comparison of the two models highlights their complementary strengths in fluency, contextual integration, and creativity. Future directions focus on enhancing preprocessing, integrating external knowledge sources, and improving scalability.
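The sketch below illustrates the kind of pipeline the abstract describes: generating a counter-narrative with a lightweight Hugging Face seq2seq checkpoint and scoring it with BLEU, ROUGE, and BERTScore. It is a minimal illustration, not the authors' released code; the checkpoint name, task prefix, decoding parameters, and toy inputs are all assumptions.

from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
import evaluate

# "t5-small" is used here; a DistilBART checkpoint such as
# "sshleifer/distilbart-cnn-12-6" (an assumption, since the paper does not
# name its exact checkpoints) would slot in the same way.
MODEL_NAME = "t5-small"

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_NAME)

hate_speech = "An example hateful message targeting a group."        # toy input
reference = "A respectful counter-narrative rebutting that message."  # toy gold output

# Hypothetical task prefix; the paper does not specify its prompt format.
inputs = tokenizer("generate counterspeech: " + hate_speech,
                   return_tensors="pt", truncation=True, max_length=512)
output_ids = model.generate(**inputs, max_length=128, num_beams=4,
                            no_repeat_ngram_size=3, early_stopping=True)
prediction = tokenizer.decode(output_ids[0], skip_special_tokens=True)

# The surface-overlap and semantic-similarity metrics named in the abstract.
bleu = evaluate.load("bleu").compute(predictions=[prediction], references=[[reference]])
rouge = evaluate.load("rouge").compute(predictions=[prediction], references=[reference])
bert = evaluate.load("bertscore").compute(predictions=[prediction],
                                          references=[reference], lang="en")

print("counter-narrative:", prediction)
print("BLEU:", bleu["bleu"], "ROUGE-L:", rouge["rougeL"], "BERTScore F1:", bert["f1"][0])

In practice the same generate-and-score loop would run over the full multilingual test set, with a multilingual checkpoint or per-language fine-tuning substituted where a single English model falls short.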
Anthology ID: 2025.mcg-1.2
Volume: Proceedings of the First Workshop on Multilingual Counterspeech Generation
Month: January
Year: 2025
Address: Abu Dhabi, UAE
Editors: Helena Bonaldi, María Estrella Vallecillo-Rodríguez, Irune Zubiaga, Arturo Montejo-Ráez, Aitor Soroa, María Teresa Martín-Valdivia, Marco Guerini, Rodrigo Agerri
Venues: MCG | WS
Publisher: Association for Computational Linguistics
Pages: 13–18
URL: https://aclanthology.org/2025.mcg-1.2/
Cite (ACL): Ravindran V. 2025. RSSN at Multilingual Counterspeech Generation: Leveraging Lightweight Transformers for Efficient and Context-Aware Counter-Narrative Generation. In Proceedings of the First Workshop on Multilingual Counterspeech Generation, pages 13–18, Abu Dhabi, UAE. Association for Computational Linguistics.
Cite (Informal): RSSN at Multilingual Counterspeech Generation: Leveraging Lightweight Transformers for Efficient and Context-Aware Counter-Narrative Generation (V, MCG 2025)
PDF: https://aclanthology.org/2025.mcg-1.2.pdf