Context or Knowledge is Not Always Necessary: A Contrastive Learning Framework for Emotion Recognition in Conversations

Geng Tu, Bin Liang, Ruibin Mao, Min Yang, Ruifeng Xu


Abstract
Emotion recognition in conversations (ERC) aims to detect the emotion of utterances in conversations. Existing efforts generally focus on modeling context- and knowledge-sensitive dependencies. However, the emotions of many utterances can be correctly detected without context or external knowledge, and in such cases blindly leveraging the context and external knowledge may impede model training. Based on this observation, we propose a novel framework based on contrastive learning (CL), called CKCL (Contrastive learning scenarios among Context and Knowledge), to distinguish such utterances and learn better vector representations. The CKCL framework treats context- and knowledge-independent utterances, i.e., those whose predicted results remain unchanged even when the context and knowledge representations are masked, as positive samples, and all other utterances as negative samples. This yields a latent feature reflecting how strongly context and external knowledge influence the predicted results, effectively denoising irrelevant context and knowledge during training. Experimental results on four datasets show that the performance of CKCL-based models is significantly boosted and outperforms state-of-the-art methods.
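The positive/negative criterion described in the abstract can be illustrated with a toy sketch. The classifier, feature dimensions, and function names below are all hypothetical stand-ins (not the authors' implementation): an utterance counts as a positive sample when masking (zeroing) its context and knowledge representations leaves the predicted label unchanged.

```python
import numpy as np

def predict(utt_vec, ctx_vec, kn_vec, W):
    """Toy emotion classifier: argmax over linear scores on the
    concatenated utterance, context, and knowledge features."""
    feats = np.concatenate([utt_vec, ctx_vec, kn_vec])
    return int(np.argmax(W @ feats))

def label_cl_samples(batch, W):
    """Split a batch of (utterance, context, knowledge) triples into
    positive and negative contrastive samples per the CKCL criterion:
    positive if the prediction is unchanged when the context and
    knowledge representations are masked out."""
    positives, negatives = [], []
    for utt, ctx, kn in batch:
        full = predict(utt, ctx, kn, W)
        masked = predict(utt, np.zeros_like(ctx), np.zeros_like(kn), W)
        (positives if full == masked else negatives).append((utt, ctx, kn))
    return positives, negatives
```

In the paper these positive/negative sets would then feed a contrastive loss so the encoder learns to separate context-dependent from context-independent utterances; the sketch above only shows the sample-labeling step.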
Anthology ID:
2023.findings-acl.883
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
14054–14067
URL:
https://aclanthology.org/2023.findings-acl.883
DOI:
10.18653/v1/2023.findings-acl.883
Cite (ACL):
Geng Tu, Bin Liang, Ruibin Mao, Min Yang, and Ruifeng Xu. 2023. Context or Knowledge is Not Always Necessary: A Contrastive Learning Framework for Emotion Recognition in Conversations. In Findings of the Association for Computational Linguistics: ACL 2023, pages 14054–14067, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Context or Knowledge is Not Always Necessary: A Contrastive Learning Framework for Emotion Recognition in Conversations (Tu et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-acl.883.pdf