Team Bias Busters at WASSA 2023 Empathy, Emotion and Personality Shared Task: Emotion Detection with Generative Pretrained Transformers

Andrew Nedilko, Yi Chu


Abstract
This paper describes the approach we used to take part in the multi-label, multi-class emotion classification task (Track 3) of the WASSA 2023 Empathy, Emotion and Personality Shared Task at ACL 2023. The overall goal of this track was to build models that predict 8 classes (7 emotions + neutral) from short English essays written in response to news articles about events perceived as harmful to people. We used OpenAI generative pretrained transformers through their full-scale APIs for the emotion prediction task, fine-tuning a GPT-3 model and engineering prompts for zero-shot / few-shot learning with the ChatGPT and GPT-4 models, guided by multiple experiments on the dev set. The most effective method was fine-tuning a GPT-3 model, which allowed us to beat our baseline character-based XGBoost classifier and rank 2nd among all participants, achieving a macro F1 score of 0.65 and a micro F1 score of 0.7 on the final blind test set.
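For illustration, below is a minimal sketch of the zero-shot prompting setup described in the abstract, written against the OpenAI Python SDK. The model name, prompt wording, and emotion label list are placeholder assumptions (the abstract specifies 8 classes without naming them here); this is not the authors' exact configuration.

```python
# Minimal sketch of zero-shot emotion labeling with the OpenAI Chat Completions API.
# Model name, prompt wording, and the label list below are illustrative assumptions,
# not the exact setup reported in the paper.
from openai import OpenAI

# Placeholder label set standing in for the track's 8 classes (7 emotions + neutral).
LABELS = ["anger", "disgust", "fear", "hope", "joy", "sadness", "surprise", "neutral"]

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def predict_emotions(essay: str) -> list[str]:
    """Ask the model to assign one or more emotion labels to an essay."""
    system_msg = (
        "You are an emotion classifier. Given an essay written in response to a "
        "news article about a harmful event, return every applicable label from "
        f"this list, comma-separated: {', '.join(LABELS)}."
    )
    response = client.chat.completions.create(
        model="gpt-4",          # or "gpt-3.5-turbo" for the ChatGPT setting
        temperature=0,          # deterministic labeling
        messages=[
            {"role": "system", "content": system_msg},
            {"role": "user", "content": essay},
        ],
    )
    raw = response.choices[0].message.content
    # Keep only labels from the known set, in case the model adds extra text.
    return [label for label in LABELS if label in raw.lower()]


if __name__ == "__main__":
    print(predict_emotions("Reading about the flood victims left me heartbroken and afraid."))
```

The fine-tuned GPT-3 variant reported as the best-performing method would instead train on prompt/completion pairs uploaded through the OpenAI fine-tuning API and then call the resulting custom model at inference time; the sketch above covers only the zero-shot case.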
Anthology ID:
2023.wassa-1.53
Volume:
Proceedings of the 13th Workshop on Computational Approaches to Subjectivity, Sentiment, & Social Media Analysis
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Jeremy Barnes, Orphée De Clercq, Roman Klinger
Venue:
WASSA
Publisher:
Association for Computational Linguistics
Pages:
569–573
URL:
https://aclanthology.org/2023.wassa-1.53
DOI:
10.18653/v1/2023.wassa-1.53
Cite (ACL):
Andrew Nedilko and Yi Chu. 2023. Team Bias Busters at WASSA 2023 Empathy, Emotion and Personality Shared Task: Emotion Detection with Generative Pretrained Transformers. In Proceedings of the 13th Workshop on Computational Approaches to Subjectivity, Sentiment, & Social Media Analysis, pages 569–573, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Team Bias Busters at WASSA 2023 Empathy, Emotion and Personality Shared Task: Emotion Detection with Generative Pretrained Transformers (Nedilko & Chu, WASSA 2023)
PDF:
https://aclanthology.org/2023.wassa-1.53.pdf
Video:
https://aclanthology.org/2023.wassa-1.53.mp4