Shapes of Emotions: Multimodal Emotion Recognition in Conversations via Emotion Shifts

Keshav Bansal, Harsh Agarwal, Abhinav Joshi, Ashutosh Modi


Abstract
Emotion Recognition in Conversations (ERC) is an important and active research area. Recent work has shown the benefits of using multiple modalities (e.g., text, audio, and video) for the ERC task. In a conversation, participants tend to maintain a particular emotional state unless a stimulus evokes a change, producing a continuous ebb and flow of emotions. Inspired by this observation, we propose a multimodal ERC model and augment it with an emotion-shift component that improves performance. The proposed emotion-shift component is modular and can be added to any existing multimodal ERC model (with a few modifications). We experiment with different variants of the model, and the results show that including the emotion-shift signal helps the model outperform existing ERC models on the MOSEI and IEMOCAP datasets.
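
The abstract describes the emotion-shift component only at a high level. Below is a minimal PyTorch sketch of one way such a modular component could work: it scores whether the emotion shifts between consecutive utterances and uses that score to gate how much emotional context carries over. All names (EmotionShiftGate, feat_dim, hidden_dim) and the specific gating scheme are illustrative assumptions, not the authors' implementation; see the linked code repository for the actual model.

import torch
import torch.nn as nn

class EmotionShiftGate(nn.Module):
    """Hypothetical emotion-shift module: predicts whether the emotion
    shifts between consecutive utterances and gates the emotional
    context accordingly."""

    def __init__(self, feat_dim: int, hidden_dim: int = 128):
        super().__init__()
        # Scores a pair of consecutive fused multimodal utterance features.
        self.shift_scorer = nn.Sequential(
            nn.Linear(2 * feat_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, 1),
        )

    def forward(self, prev_utt: torch.Tensor, curr_utt: torch.Tensor):
        # prev_utt, curr_utt: (batch, feat_dim) fused multimodal features
        # for consecutive utterances.
        pair = torch.cat([prev_utt, curr_utt], dim=-1)
        shift_prob = torch.sigmoid(self.shift_scorer(pair))  # (batch, 1)
        # High shift probability -> rely on the current utterance;
        # low probability -> carry over the previous emotional state.
        gated = shift_prob * curr_utt + (1.0 - shift_prob) * prev_utt
        return gated, shift_prob

# Example usage with random features standing in for fused
# text/audio/video representations.
gate = EmotionShiftGate(feat_dim=256)
prev = torch.randn(4, 256)
curr = torch.randn(4, 256)
context, p_shift = gate(prev, curr)  # context: (4, 256), p_shift: (4, 1)

During training, the shift probability could additionally be supervised with a binary auxiliary loss derived from whether the gold emotion labels of consecutive utterances differ; this is one plausible way to realize the emotion-shift signal the abstract refers to.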
Anthology ID:
2022.mmmpie-1.6
Original:
2022.mmmpie-1.6v1
Version 2:
2022.mmmpie-1.6v2
Volume:
Proceedings of the First Workshop on Performance and Interpretability Evaluations of Multimodal, Multipurpose, Massive-Scale Models
Month:
October
Year:
2022
Address:
Virtual
Venue:
MMMPIE
Publisher:
International Conference on Computational Linguistics
Pages:
44–56
URL:
https://aclanthology.org/2022.mmmpie-1.6
Cite (ACL):
Keshav Bansal, Harsh Agarwal, Abhinav Joshi, and Ashutosh Modi. 2022. Shapes of Emotions: Multimodal Emotion Recognition in Conversations via Emotion Shifts. In Proceedings of the First Workshop on Performance and Interpretability Evaluations of Multimodal, Multipurpose, Massive-Scale Models, pages 44–56, Virtual. International Conference on Computational Linguistics.
Cite (Informal):
Shapes of Emotions: Multimodal Emotion Recognition in Conversations via Emotion Shifts (Bansal et al., MMMPIE 2022)
PDF:
https://aclanthology.org/2022.mmmpie-1.6.pdf
Code:
exploration-lab/shapes-of-emotion
Data:
IEMOCAP