Self-Supervised Contrastive Learning for Efficient User Satisfaction Prediction in Conversational Agents

Mohammad Kachuee, Hao Yuan, Young-Bum Kim, Sungjin Lee


Abstract
Turn-level user satisfaction is one of the most important performance metrics for conversational agents. It can be used to monitor the agent’s performance and provide insights about defective user experiences. While end-to-end deep learning has shown promising results, having access to a large number of reliable annotated samples required by these methods remains challenging. In a large-scale conversational system, there is a growing number of newly developed skills, making the traditional data collection, annotation, and modeling process impractical due to the required annotation costs and the turnaround times. In this paper, we suggest a self-supervised contrastive learning approach that leverages the pool of unlabeled data to learn user-agent interactions. We show that the pre-trained models using the self-supervised objective are transferable to the user satisfaction prediction. In addition, we propose a novel few-shot transfer learning approach that ensures better transferability for very small sample sizes. The suggested few-shot method does not require any inner loop optimization process and is scalable to very large datasets and complex models. Based on our experiments using real data from a large-scale commercial system, the suggested approach is able to significantly reduce the required number of annotations, while improving the generalization on unseen skills.
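The page does not reproduce the paper's actual training objective. Purely as an illustration of the general idea of self-supervised contrastive pre-training on unlabeled interaction data, the following sketch implements a generic InfoNCE-style loss; the function name `info_nce_loss`, the use of NumPy, and the choice of this particular objective are assumptions for illustration, not the authors' method.

```python
import numpy as np

def info_nce_loss(anchors, positives, temperature=0.1):
    """Generic InfoNCE contrastive loss (illustrative, not the paper's exact objective).

    anchors[i] and positives[i] are embeddings of two views of the same
    user-agent interaction; all other rows serve as in-batch negatives.
    """
    # Normalize embeddings to unit length so dot products are cosine similarities
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    # Similarity logits: row i compares anchor i against every positive
    logits = a @ p.T / temperature
    # Log-softmax over each row, with max-subtraction for numerical stability
    logits -= logits.max(axis=1, keepdims=True)
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # The matched pair sits on the diagonal; minimize its negative log-probability
    return -np.mean(np.diag(log_probs))
```

In a pipeline like the one the abstract describes, an encoder pre-trained with such a loss on unlabeled dialogs would then be fine-tuned (here, for turn-level satisfaction prediction) on a much smaller labeled set.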
Anthology ID:
2021.naacl-main.319
Volume:
Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
Month:
June
Year:
2021
Address:
Online
Editors:
Kristina Toutanova, Anna Rumshisky, Luke Zettlemoyer, Dilek Hakkani-Tur, Iz Beltagy, Steven Bethard, Ryan Cotterell, Tanmoy Chakraborty, Yichao Zhou
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
4053–4064
URL:
https://aclanthology.org/2021.naacl-main.319
DOI:
10.18653/v1/2021.naacl-main.319
Cite (ACL):
Mohammad Kachuee, Hao Yuan, Young-Bum Kim, and Sungjin Lee. 2021. Self-Supervised Contrastive Learning for Efficient User Satisfaction Prediction in Conversational Agents. In Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pages 4053–4064, Online. Association for Computational Linguistics.
Cite (Informal):
Self-Supervised Contrastive Learning for Efficient User Satisfaction Prediction in Conversational Agents (Kachuee et al., NAACL 2021)
PDF:
https://aclanthology.org/2021.naacl-main.319.pdf
Video:
https://aclanthology.org/2021.naacl-main.319.mp4