TECHSSN1 at SemEval-2024 Task 10: Emotion Classification in Hindi-English Code-Mixed Dialogue using Transformer-based Models

Venkatasai Ojus Yenumulapalli, Pooja Premnath, Parthiban Mohankumar, Rajalakshmi Sivanaiah, Angel Deborah


Abstract
The increasing popularity of code-mixed languages has created a need to engineer language models for them. Unlike pure languages, code-mixed languages lack clear grammatical structures, leading to ambiguous sentence constructions. This ambiguity presents significant challenges for natural language processing tasks, including syntactic parsing, word sense disambiguation, and language identification. This paper focuses on emotion recognition in conversations in Hinglish, a mix of Hindi and English, as part of Task 10 of SemEval-2024. The proposed approach explores standard machine learning models such as SVM, MNB, and RF, as well as BERT-based models for Hindi-English code-mixed data, namely HingBERT, HingMBERT, and HingRoBERTa, for subtask A.
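As a rough illustration of the transformer-based setup described in the abstract, the sketch below fine-tunes a HingRoBERTa-style checkpoint for utterance-level emotion classification with Hugging Face Transformers. This is not the authors' code: the checkpoint name (l3cube-pune/hing-roberta), the label set, and the toy training data are assumptions for illustration only.

```python
# Illustrative sketch (not the paper's implementation): fine-tuning a
# Hindi-English code-mixed transformer for emotion classification.
import torch
from transformers import (AutoTokenizer, AutoModelForSequenceClassification,
                          Trainer, TrainingArguments)

# Assumed label set; the actual task labels may differ.
LABELS = ["anger", "disgust", "fear", "joy", "neutral", "sadness", "surprise"]

# Assumed checkpoint name for a HingRoBERTa-style model.
CHECKPOINT = "l3cube-pune/hing-roberta"
tokenizer = AutoTokenizer.from_pretrained(CHECKPOINT)
model = AutoModelForSequenceClassification.from_pretrained(
    CHECKPOINT, num_labels=len(LABELS))

class DialogueDataset(torch.utils.data.Dataset):
    """Wraps (utterance, label) pairs from code-mixed dialogue data."""
    def __init__(self, texts, labels):
        self.enc = tokenizer(texts, truncation=True, padding=True, max_length=128)
        self.labels = labels
    def __len__(self):
        return len(self.labels)
    def __getitem__(self, idx):
        item = {k: torch.tensor(v[idx]) for k, v in self.enc.items()}
        item["labels"] = torch.tensor(self.labels[idx])
        return item

# Hypothetical toy example; in practice the task's Hinglish dialogue
# utterances and their emotion labels would be loaded here.
train_ds = DialogueDataset(["yaar this is so exciting!"], [LABELS.index("joy")])

args = TrainingArguments(output_dir="hing-emotion",
                         num_train_epochs=3,
                         per_device_train_batch_size=16)
Trainer(model=model, args=args, train_dataset=train_ds).train()
```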
Anthology ID:
2024.semeval-1.119
Volume:
Proceedings of the 18th International Workshop on Semantic Evaluation (SemEval-2024)
Month:
June
Year:
2024
Address:
Mexico City, Mexico
Editors:
Atul Kr. Ojha, A. Seza Doğruöz, Harish Tayyar Madabushi, Giovanni Da San Martino, Sara Rosenthal, Aiala Rosá
Venue:
SemEval
SIG:
SIGLEX
Publisher:
Association for Computational Linguistics
Note:
Pages:
833–838
Language:
URL:
https://aclanthology.org/2024.semeval-1.119
DOI:
10.18653/v1/2024.semeval-1.119
Bibkey:
Cite (ACL):
Venkatasai Ojus Yenumulapalli, Pooja Premnath, Parthiban Mohankumar, Rajalakshmi Sivanaiah, and Angel Deborah. 2024. TECHSSN1 at SemEval-2024 Task 10: Emotion Classification in Hindi-English Code-Mixed Dialogue using Transformer-based Models. In Proceedings of the 18th International Workshop on Semantic Evaluation (SemEval-2024), pages 833–838, Mexico City, Mexico. Association for Computational Linguistics.
Cite (Informal):
TECHSSN1 at SemEval-2024 Task 10: Emotion Classification in Hindi-English Code-Mixed Dialogue using Transformer-based Models (Yenumulapalli et al., SemEval 2024)
PDF:
https://aclanthology.org/2024.semeval-1.119.pdf
Supplementary material:
 2024.semeval-1.119.SupplementaryMaterial.txt
Supplementary material:
 2024.semeval-1.119.SupplementaryMaterial.zip