Conversational Emotion-Cause Pair Extraction with Guided Mixture of Experts

DongJin Jeong, JinYeong Bak


Abstract
The Emotion-Cause Pair Extraction (ECPE) task aims to pair all emotions and their corresponding causes in a document. ECPE is an important task for developing human-like responses. However, previous ECPE research has been conducted on news articles, which have different characteristics from dialogues. To address this issue, we propose a Pair-Relationship Guided Mixture-of-Experts (PRG-MoE) model, which considers dialogue features (e.g., speaker information). PRG-MoE automatically learns the relationship between utterances and guides a gating network to incorporate dialogue features in the evaluation, yielding substantial performance improvement. We employ a new ECPE dataset, an English dialogue dataset with more emotion-cause pairs per document than news articles. We also propose Cause Type Classification, which classifies emotion-cause pairs according to the type of cause of a detected emotion. For reproducibility, we make all our code and data available.
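As a rough illustration of the idea described in the abstract (and not the authors' released implementation), the sketch below shows a mixture-of-experts pair classifier whose gating network is conditioned on dialogue features such as a same-speaker flag and utterance distance. All layer sizes, feature choices, and names are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GuidedMoEPairClassifier(nn.Module):
    """Minimal sketch: experts score (emotion, candidate cause) utterance pairs,
    and a gate conditioned on dialogue features mixes the expert scores.
    Hypothetical architecture; not the paper's actual PRG-MoE code."""

    def __init__(self, utt_dim: int, num_experts: int = 4, hidden: int = 128):
        super().__init__()
        # Each expert scores a concatenated pair representation.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(2 * utt_dim, hidden), nn.ReLU(), nn.Linear(hidden, 1))
            for _ in range(num_experts)
        )
        # The gate sees the pair representation plus two dialogue features
        # (same-speaker flag, utterance distance) and outputs expert weights.
        self.gate = nn.Linear(2 * utt_dim + 2, num_experts)

    def forward(self, emo_repr, cause_repr, same_speaker, distance):
        pair = torch.cat([emo_repr, cause_repr], dim=-1)            # (B, 2*utt_dim)
        dialog_feats = torch.stack([same_speaker, distance], dim=-1)  # (B, 2)
        gate_weights = F.softmax(self.gate(torch.cat([pair, dialog_feats], dim=-1)), dim=-1)
        expert_scores = torch.cat([e(pair) for e in self.experts], dim=-1)  # (B, E)
        # Weighted combination -> probability that the pair is an emotion-cause pair.
        return torch.sigmoid((gate_weights * expert_scores).sum(dim=-1))

# Example usage with dummy utterance encodings.
model = GuidedMoEPairClassifier(utt_dim=256)
emo = torch.randn(8, 256)
cause = torch.randn(8, 256)
same_spk = torch.randint(0, 2, (8,)).float()
dist = torch.randint(0, 5, (8,)).float()
print(model(emo, cause, same_spk, dist).shape)  # torch.Size([8])
```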
Anthology ID:
2023.eacl-main.240
Volume:
Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics
Month:
May
Year:
2023
Address:
Dubrovnik, Croatia
Editors:
Andreas Vlachos, Isabelle Augenstein
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
3288–3298
URL:
https://aclanthology.org/2023.eacl-main.240
DOI:
10.18653/v1/2023.eacl-main.240
Cite (ACL):
DongJin Jeong and JinYeong Bak. 2023. Conversational Emotion-Cause Pair Extraction with Guided Mixture of Experts. In Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics, pages 3288–3298, Dubrovnik, Croatia. Association for Computational Linguistics.
Cite (Informal):
Conversational Emotion-Cause Pair Extraction with Guided Mixture of Experts (Jeong & Bak, EACL 2023)
PDF:
https://aclanthology.org/2023.eacl-main.240.pdf
Video:
https://aclanthology.org/2023.eacl-main.240.mp4