Semi-Supervised Knowledge-Grounded Pre-training for Task-Oriented Dialog Systems

Weihao Zeng, Keqing He, Zechen Wang, Dayuan Fu, Guanting Dong, Ruotong Geng, Pei Wang, Jingang Wang, Chaobo Sun, Wei Wu, Weiran Xu


Abstract
Recent advances in neural approaches have greatly improved task-oriented dialogue (TOD) systems, which assist users in accomplishing their goals. However, such systems rely on costly, manually labeled dialogs that are often unavailable in practical scenarios. In this paper, we present our models for Track 2 of the SereTOD 2022 challenge, the first challenge on building semi-supervised and reinforced TOD systems on MobileCS, a large-scale real-world Chinese TOD dataset. We build a knowledge-grounded dialog model that takes the dialog history and a local knowledge base (KB) as input and predicts the system response, and we perform semi-supervised pre-training on both the labeled and unlabeled data. Our system achieves first place in both the automatic evaluation and the human interaction evaluation, with notably higher BLEU (+7.64) and Success (+13.6%) than the second-place system.
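As a rough illustration of the knowledge-grounded input formulation described above, the sketch below flattens a local KB and the dialog history into a single source sequence for a generic seq2seq model. The backbone (`google/mt5-small`), the separators, and the helper names are illustrative assumptions, not the authors' actual configuration.

```python
# Minimal sketch of knowledge-grounded response generation.
# Assumptions: a Hugging Face seq2seq backbone and a simple linearized
# KB/history format; the paper's exact backbone and serialization differ.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

MODEL_NAME = "google/mt5-small"  # placeholder backbone (assumption)
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_NAME)

def build_input(dialog_history, local_kb):
    """Flatten KB triples and dialog turns into one source string."""
    kb_text = " ; ".join(f"{s} {p} {v}" for (s, p, v) in local_kb)
    history_text = " [SEP] ".join(dialog_history)
    return f"knowledge: {kb_text} [SEP] history: {history_text}"

def generate_response(dialog_history, local_kb):
    """Condition the decoder on the serialized KB and history."""
    source = build_input(dialog_history, local_kb)
    inputs = tokenizer(source, return_tensors="pt", truncation=True)
    output_ids = model.generate(**inputs, max_new_tokens=64)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)
```

In a semi-supervised setup of the kind the abstract mentions, the same sequence-to-sequence objective could be applied to unlabeled dialogs (without KB annotations) during pre-training and to labeled dialogs during fine-tuning; the split above is only a sketch of that idea.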
Anthology ID:
2022.seretod-1.6
Volume:
Proceedings of the Towards Semi-Supervised and Reinforced Task-Oriented Dialog Systems (SereTOD)
Month:
December
Year:
2022
Address:
Abu Dhabi, Beijing (Hybrid)
Editors:
Zhijian Ou, Junlan Feng, Juanzi Li
Venue:
SereTOD
Publisher:
Association for Computational Linguistics
Pages:
39–47
URL:
https://aclanthology.org/2022.seretod-1.6
DOI:
10.18653/v1/2022.seretod-1.6
Cite (ACL):
Weihao Zeng, Keqing He, Zechen Wang, Dayuan Fu, Guanting Dong, Ruotong Geng, Pei Wang, Jingang Wang, Chaobo Sun, Wei Wu, and Weiran Xu. 2022. Semi-Supervised Knowledge-Grounded Pre-training for Task-Oriented Dialog Systems. In Proceedings of the Towards Semi-Supervised and Reinforced Task-Oriented Dialog Systems (SereTOD), pages 39–47, Abu Dhabi, Beijing (Hybrid). Association for Computational Linguistics.
Cite (Informal):
Semi-Supervised Knowledge-Grounded Pre-training for Task-Oriented Dialog Systems (Zeng et al., SereTOD 2022)
PDF:
https://aclanthology.org/2022.seretod-1.6.pdf
Video:
https://aclanthology.org/2022.seretod-1.6.mp4