Domain-Lifelong Learning for Dialogue State Tracking via Knowledge Preservation Networks

Qingbin Liu, Pengfei Cao, Cao Liu, Jiansong Chen, Xunliang Cai, Fan Yang, Shizhu He, Kang Liu, Jun Zhao


Abstract
Dialogue state tracking (DST), which estimates user goals given a dialogue context, is an essential component of task-oriented dialogue systems. Conventional DST models are usually trained offline on a fixed dataset prepared in advance. This paradigm is often impractical in real-world applications, since online dialogue systems continually encounter new data and new domains. Therefore, this paper explores Domain-Lifelong Learning for Dialogue State Tracking (DLL-DST), which aims to continually train a DST model on new data so that it learns newly emerging domains while avoiding catastrophic forgetting of previously learned domains. To this end, we propose a novel domain-lifelong learning method, called Knowledge Preservation Networks (KPN), which consists of multi-prototype enhanced retrospection and multi-strategy knowledge distillation, to address the problems of expression diversity and combinatorial explosion in the DLL-DST task. Experimental results show that KPN effectively alleviates catastrophic forgetting and outperforms previous state-of-the-art lifelong learning methods by 4.25% and 8.27% in whole joint goal accuracy on the MultiWOZ and SGD benchmarks, respectively.
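The abstract refers to knowledge distillation as one of KPN's two components. As context, here is a minimal sketch of the generic temperature-scaled distillation objective that such methods build on: the KL divergence between the old (teacher) model's softened output distribution and the new (student) model's. The function names and temperature value are illustrative assumptions, not the paper's actual multi-strategy formulation.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax; higher temperature softens the distribution."""
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """KL(teacher || student) over softened distributions, averaged over the batch.

    The T^2 factor is the standard rescaling so the gradient magnitude is
    comparable to a hard-label loss (Hinton et al.'s convention).
    """
    p = softmax(teacher_logits, temperature)  # frozen old-model targets
    q = softmax(student_logits, temperature)  # current model predictions
    kl = np.sum(p * (np.log(p) - np.log(q)), axis=-1)
    return float(kl.mean() * temperature ** 2)
```

In a lifelong-learning setting, this term is typically added to the task loss on new-domain data so that the current model's predictions stay close to those of the frozen copy trained on earlier domains.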
Anthology ID:
2021.emnlp-main.176
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
2301–2311
URL:
https://aclanthology.org/2021.emnlp-main.176
DOI:
10.18653/v1/2021.emnlp-main.176
Cite (ACL):
Qingbin Liu, Pengfei Cao, Cao Liu, Jiansong Chen, Xunliang Cai, Fan Yang, Shizhu He, Kang Liu, and Jun Zhao. 2021. Domain-Lifelong Learning for Dialogue State Tracking via Knowledge Preservation Networks. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 2301–2311, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Domain-Lifelong Learning for Dialogue State Tracking via Knowledge Preservation Networks (Liu et al., EMNLP 2021)
PDF:
https://aclanthology.org/2021.emnlp-main.176.pdf
Video:
https://aclanthology.org/2021.emnlp-main.176.mp4
Code
liuqingbin/knowledge-preservation-networks
Data
SGD