Exploiting Unsupervised Data for Emotion Recognition in Conversations

Wenxiang Jiao, Michael Lyu, Irwin King


Abstract
Emotion Recognition in Conversations (ERC) aims to predict the emotional state of speakers in a conversation, which is essentially a text classification task. Unlike sentence-level text classification, the supervised data available for ERC is limited, which may prevent models from reaching their full potential. In this paper, we propose a novel approach that leverages unsupervised conversation data, which is far more accessible. Specifically, we introduce the Conversation Completion (ConvCom) task, which requires selecting the correct answer from a set of candidates to fill a masked utterance in a conversation. We then Pre-train a basic COntext-Dependent Encoder (Pre-CODE) on the ConvCom task, and finally fine-tune the Pre-CODE on the ERC datasets. Experimental results demonstrate that pre-training on unsupervised data yields significant performance improvements on the ERC datasets, particularly on the minority emotion classes.
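The following is a minimal sketch of how a ConvCom-style pre-training instance might be constructed from raw conversation data, as described in the abstract: one utterance is masked and paired with sampled distractor candidates. Function names, the mask token, and the negative-sampling details are illustrative assumptions, not the authors' exact procedure; see the released code (wxjiao/Pre-CODE) for the actual implementation.

```python
# Illustrative sketch only: builds one Conversation Completion (ConvCom)-style
# instance by masking an utterance and adding sampled negative candidates.
# Names and sampling details are assumptions, not the authors' released code.
import random

def make_convcom_instance(conversation, utterance_pool, position, num_negatives=4):
    """Mask the utterance at `position` and build a candidate set.

    conversation: list of utterance strings forming one dialogue.
    utterance_pool: pool of utterances (e.g., from other dialogues) used
        to sample distractor candidates.
    position: index of the utterance to mask.
    num_negatives: number of distractor candidates to sample.
    """
    target = conversation[position]
    # Replace the target utterance with a mask token in the context.
    context = conversation[:position] + ["<mask>"] + conversation[position + 1:]
    # Sample distractors that differ from the true answer.
    negatives = random.sample(
        [u for u in utterance_pool if u != target], num_negatives
    )
    candidates = negatives + [target]
    random.shuffle(candidates)
    label = candidates.index(target)  # index of the correct answer
    return {"context": context, "candidates": candidates, "label": label}
```

Under this sketch, a context-dependent encoder would be pre-trained to score each candidate against the masked context, then fine-tuned on the ERC datasets.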
Anthology ID:
2020.findings-emnlp.435
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2020
Month:
November
Year:
2020
Address:
Online
Editors:
Trevor Cohn, Yulan He, Yang Liu
Venue:
Findings
SIG:
Publisher:
Association for Computational Linguistics
Note:
Pages:
4839–4846
Language:
URL:
https://aclanthology.org/2020.findings-emnlp.435
DOI:
10.18653/v1/2020.findings-emnlp.435
Bibkey:
Cite (ACL):
Wenxiang Jiao, Michael Lyu, and Irwin King. 2020. Exploiting Unsupervised Data for Emotion Recognition in Conversations. In Findings of the Association for Computational Linguistics: EMNLP 2020, pages 4839–4846, Online. Association for Computational Linguistics.
Cite (Informal):
Exploiting Unsupervised Data for Emotion Recognition in Conversations (Jiao et al., Findings 2020)
PDF:
https://aclanthology.org/2020.findings-emnlp.435.pdf
Code
 wxjiao/Pre-CODE
Data
EmotionLines
IEMOCAP