Chenyi Liang


2024

YNU-HPCC at SemEval-2024 Task10: Pre-trained Language Model for Emotion Discovery and Reasoning its Flip in Conversation
Chenyi Liang | Jin Wang | Xuejie Zhang
Proceedings of the 18th International Workshop on Semantic Evaluation (SemEval-2024)

This paper describes the application of fine-tuned pre-trained models to SemEval-2024 Task 10: Emotion Discovery and Reasoning its Flip in Conversation (EDiReF), which requires predicting the emotion of each utterance in a conversation and identifying the utterances at which an emotion flip occurs. The model is built on the DeBERTa transformer and adapted for emotion detection and flip reasoning in conversations. It uses dedicated separator tokens to delimit utterances and padding to handle variable-length inputs. R-drop, back translation, and focal loss are also employed during training. Results on the competition's official leaderboard are reported in the paper. The code of this paper is available at https://github.com/jiaowoobjiuhao/SemEval-2024-task10.
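
The abstract names focal loss and R-drop as training techniques but does not show how they are combined. The following is a minimal, hypothetical sketch of such a training objective, assuming a Hugging Face-style classifier whose forward pass returns `.logits`; the function names and the hyperparameters (`gamma`, `alpha`, `kl_weight`) are illustrative placeholders, not the authors' actual implementation or settings.

```python
# Hypothetical sketch: focal loss combined with R-drop regularization.
import torch
import torch.nn.functional as F


def focal_loss(logits, labels, gamma=2.0, alpha=1.0):
    """Focal loss: down-weights easy examples via the (1 - p_t)^gamma factor."""
    log_probs = F.log_softmax(logits, dim=-1)
    ce = F.nll_loss(log_probs, labels, reduction="none")  # -log p_t per example
    p_t = torch.exp(-ce)                                   # probability of the true class
    return (alpha * (1.0 - p_t) ** gamma * ce).mean()


def rdrop_focal_loss(model, inputs, labels, kl_weight=1.0):
    """R-drop: two stochastic forward passes plus a symmetric KL penalty."""
    logits1 = model(**inputs).logits  # dropout makes the two passes differ
    logits2 = model(**inputs).logits
    task_loss = 0.5 * (focal_loss(logits1, labels) + focal_loss(logits2, labels))
    kl = 0.5 * (
        F.kl_div(F.log_softmax(logits1, -1), F.softmax(logits2, -1), reduction="batchmean")
        + F.kl_div(F.log_softmax(logits2, -1), F.softmax(logits1, -1), reduction="batchmean")
    )
    return task_loss + kl_weight * kl
```

In this sketch the focal term handles class imbalance across emotion labels, while the symmetric KL term penalizes disagreement between the two dropout-perturbed predictions, which is the standard R-drop formulation.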