CS-BERT: a pretrained model for customer service dialogues

Peiyao Wang, Joyce Fang, Julia Reinspach

Abstract
Large-scale pretrained transformer models have demonstrated state-of-the-art (SOTA) performance in a variety of NLP tasks. Nowadays, numerous pretrained models are available in different model flavors and different languages, and can be easily adapted to one’s downstream task. However, only a limited number of models are available for dialogue tasks, and in particular, goal-oriented dialogue tasks. In addition, the available pretrained models are trained on general domain language, creating a mismatch between the pretraining language and the downstream domain language. In this contribution, we present CS-BERT, a BERT model pretrained on millions of dialogues in the customer service domain. We evaluate CS-BERT on several downstream customer service dialogue tasks, and demonstrate that our in-domain pretraining is advantageous compared to other pretrained models in both zero-shot experiments as well as in finetuning experiments, especially in a low-resource data setting.
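As a rough illustration of the approach the abstract describes, the sketch below shows how one might continue pretraining a general-domain BERT checkpoint on in-domain dialogue text with a masked-language-modeling objective, using the Hugging Face Transformers and Datasets libraries. This is a minimal sketch under stated assumptions, not the authors' actual recipe: the data file name, hyperparameters, and training setup are illustrative only.

```python
# Minimal sketch of domain-adaptive MLM pretraining in the spirit of CS-BERT.
# Assumes a hypothetical plain-text file "cs_dialogues.txt" with one utterance per line.
from datasets import load_dataset
from transformers import (
    BertTokenizerFast,
    BertForMaskedLM,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

# Start from a general-domain BERT checkpoint and continue pretraining in-domain.
tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")

# Load the raw dialogue utterances as a text dataset and tokenize them.
dataset = load_dataset("text", data_files={"train": "cs_dialogues.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

tokenized = dataset["train"].map(tokenize, batched=True, remove_columns=["text"])

# Dynamic masking: 15% of tokens are masked per batch, as in standard BERT MLM.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)

args = TrainingArguments(
    output_dir="cs-bert-mlm",          # illustrative output path
    per_device_train_batch_size=32,
    num_train_epochs=1,
    learning_rate=5e-5,
    logging_steps=500,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    data_collator=collator,
)

trainer.train()
# Downstream customer-service tasks would then finetune from this checkpoint.
trainer.save_model("cs-bert-mlm")
```

The resulting checkpoint plays the role of the in-domain pretrained model; the paper's zero-shot and finetuning evaluations would start from such a checkpoint rather than from a general-domain one.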
Anthology ID:
2021.nlp4convai-1.13
Volume:
Proceedings of the 3rd Workshop on Natural Language Processing for Conversational AI
Month:
November
Year:
2021
Address:
Online
Editors:
Alexandros Papangelis, Paweł Budzianowski, Bing Liu, Elnaz Nouri, Abhinav Rastogi, Yun-Nung Chen
Venue:
NLP4ConvAI
Publisher:
Association for Computational Linguistics
Pages:
130–142
URL:
https://aclanthology.org/2021.nlp4convai-1.13
DOI:
10.18653/v1/2021.nlp4convai-1.13
Cite (ACL):
Peiyao Wang, Joyce Fang, and Julia Reinspach. 2021. CS-BERT: a pretrained model for customer service dialogues. In Proceedings of the 3rd Workshop on Natural Language Processing for Conversational AI, pages 130–142, Online. Association for Computational Linguistics.
Cite (Informal):
CS-BERT: a pretrained model for customer service dialogues (Wang et al., NLP4ConvAI 2021)
PDF:
https://aclanthology.org/2021.nlp4convai-1.13.pdf