Xiaomeng Huang
2023
Scalable-DSC: A Structural Template Prompt Approach to Scalable Dialogue State Correction
Haoxiang Su | Hongyan Xie | Hao Huang | Shuangyong Song | Ruiyu Fang | Xiaomeng Huang | Sijie Feng
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Dialogue state error correction has recently been proposed to correct wrong slot values in predicted dialogue states, thereby mitigating the error propagation problem in dialogue state tracking (DST). These approaches, though effective, are heavily intertwined with specific DST models, limiting their applicability to other DST models. To solve this problem, we propose Scalable Dialogue State Correction (Scalable-DSC), which can correct wrong slot values in the dialogue state predicted by any DST model. Specifically, we propose a Structural Template Prompt (STP) that converts the predicted dialogue state from any DST model into a standardized natural language sequence as part of the historical context, associates it with the dialogue history information, and generates a corrected dialogue state sequence based on predefined template options. We further enhance Scalable-DSC with two training strategies. The first employs a predictive state simulator to simulate predicted dialogue states as training data, enhancing the generalization ability of the model. The second uses the dialogue states predicted by DST models as training data, aiming to mitigate the inconsistent error type distribution between training and inference. Experiments confirm that our model achieves state-of-the-art results on MultiWOZ 2.0-2.4.
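As a loose illustration of the STP idea described in the abstract, the sketch below serializes a predicted dialogue state into a fixed natural-language form and joins it with the dialogue history to build the correction model's input. The function names, separator token, and template wording are assumptions for illustration only, not the paper's actual implementation.

```python
# Minimal sketch: turn any DST model's predicted state into a standardized
# natural-language sequence and prepend the dialogue history, so a single
# correction model can consume states from arbitrary DST models.
# All names and the template wording here are hypothetical.

from typing import Dict, List


def serialize_state(predicted_state: Dict[str, str]) -> str:
    """Render a {domain-slot: value} dict as a standardized sequence."""
    if not predicted_state:
        return "the predicted dialogue state is empty"
    parts = [f"{slot} is {value}" for slot, value in sorted(predicted_state.items())]
    return "the predicted dialogue state is: " + " ; ".join(parts)


def build_correction_input(history: List[str], predicted_state: Dict[str, str]) -> str:
    """Combine the dialogue history with the serialized state into one prompt."""
    return " [SEP] ".join(history) + " [SEP] " + serialize_state(predicted_state)


if __name__ == "__main__":
    history = [
        "user: I need a cheap hotel in the north.",
        "system: Sure, any star rating in mind?",
    ]
    state = {"hotel-area": "north", "hotel-pricerange": "cheap"}
    print(build_correction_input(history, state))
```

Because the serialization is independent of how the state was produced, the same prompt format can wrap the output of any DST model, which is the scalability property the paper targets.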