PFA-ERC: Pseudo-Future Augmented Dynamic Emotion Recognition in Conversations

Tanmay Khule, Rishabh Agrawal, Apurva Narayan


Abstract
AI systems’ ability to interpret human emotions and adapt to their variations is becoming more crucial as AI gets embedded into everyday life. Emotion Recognition in Conversations (ERC) addresses this fundamental challenge. Current state-of-the-art ERC methods are limited because they require access to future utterances. We introduce the High-Dimensional Temporal Fusion Transformer (HiTFT), a time-series forecasting transformer that predicts pseudo-future information to overcome this constraint. This preserves the model’s dynamic nature while providing future context more efficiently than other methods. Our proposed method combines pseudo-future embeddings with an encoder that models the speaker’s emotional state using past and pseudo-future information as well as inter- and intra-speaker interactions; these speaker states are then passed through a decoder block that predicts the inferred emotion of each utterance. We further evaluate our method and show that it achieves state-of-the-art performance on three ERC datasets: MELD, EmoryNLP, and IEMOCAP.
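The pipeline the abstract describes (forecast pseudo-future embeddings, encode a speaker state from past plus pseudo-future context, decode an emotion label) can be sketched roughly as below. This is a hypothetical illustration, not the authors' implementation: the function names, embedding dimension, and the trivial last-value forecaster standing in for HiTFT are all assumptions.

```python
# Illustrative sketch of the PFA-ERC flow; shapes and components are assumed.
import numpy as np

rng = np.random.default_rng(0)
d = 16  # assumed utterance-embedding dimension

def forecast_pseudo_future(past, horizon=2):
    """Stand-in for the HiTFT forecaster: predict pseudo-future utterance
    embeddings from past ones (here, a trivial last-value carry-forward)."""
    return np.repeat(past[-1:], horizon, axis=0)

def encode_speaker_state(past, pseudo_future):
    """Stand-in encoder: pool past and pseudo-future context into one
    speaker-state vector (a real encoder would also model inter- and
    intra-speaker interactions)."""
    return np.concatenate([past, pseudo_future]).mean(axis=0)

def decode_emotion(state, W):
    """Stand-in decoder: linear projection plus argmax over emotion classes."""
    return int(np.argmax(W @ state))

past = rng.normal(size=(5, d))   # embeddings of the utterances seen so far
W = rng.normal(size=(7, d))      # 7 classes, e.g. MELD's emotion label set
pf = forecast_pseudo_future(past)
state = encode_speaker_state(past, pf)
label = decode_emotion(state, W)
```

The point of the sketch is the data flow: only past utterances are consumed at inference time, and the forecaster supplies the "future" context that prior ERC methods obtained by reading ahead in the conversation.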
Anthology ID:
2024.findings-emnlp.950
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2024
Month:
November
Year:
2024
Address:
Miami, Florida, USA
Editors:
Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
16196–16207
URL:
https://aclanthology.org/2024.findings-emnlp.950
Cite (ACL):
Tanmay Khule, Rishabh Agrawal, and Apurva Narayan. 2024. PFA-ERC: Pseudo-Future Augmented Dynamic Emotion Recognition in Conversations. In Findings of the Association for Computational Linguistics: EMNLP 2024, pages 16196–16207, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal):
PFA-ERC: Pseudo-Future Augmented Dynamic Emotion Recognition in Conversations (Khule et al., Findings 2024)
PDF:
https://aclanthology.org/2024.findings-emnlp.950.pdf