Diversifying Neural Dialogue Generation via Negative Distillation

Yiwei Li, Shaoxiong Feng, Bin Sun, Kan Li


Abstract
Generative dialogue models suffer severely from the generic response problem, which limits their application to a few toy scenarios. Recently, an interesting approach, namely negative training, was proposed to alleviate this problem by reminding the model not to generate high-frequency responses during training. However, its performance is hindered by two issues: it ignores low-frequency but generic responses, and it introduces low-frequency but meaningless ones. In this paper, we propose a novel negative training paradigm, called negative distillation, that keeps the model away from undesirable generic responses while avoiding the above problems. First, we introduce a negative teacher model that produces query-wise generic responses; the student model is then required to maximize its distance from this multi-level negative knowledge. Empirical results show that our method significantly outperforms previous negative training methods.
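
The core idea described above, pushing the student away from a negative teacher's distribution over generic responses, can be sketched with a toy objective. The snippet below is only an illustration under assumptions, not the authors' multi-level formulation: it assumes a single KL-based divergence term and a hypothetical negative_distillation_loss helper.

import torch
import torch.nn.functional as F

def negative_distillation_loss(student_logits, neg_teacher_logits, temperature=1.0):
    # Hypothetical helper: turn both logit tensors into distributions and
    # measure KL(teacher || student); negating it rewards moving AWAY from
    # the negative teacher rather than towards it.
    student_log_probs = F.log_softmax(student_logits / temperature, dim=-1)
    teacher_probs = F.softmax(neg_teacher_logits / temperature, dim=-1)
    kl = F.kl_div(student_log_probs, teacher_probs, reduction="batchmean")
    return -kl  # minimizing this loss maximizes the divergence

# Toy usage: a batch of 2 decoding positions over a 5-token vocabulary.
student_logits = torch.randn(2, 5, requires_grad=True)
neg_teacher_logits = torch.randn(2, 5)
loss = negative_distillation_loss(student_logits, neg_teacher_logits)
loss.backward()  # gradients push the student distribution off the teacher's

In practice such a term would be only one component of training (added to a standard maximum-likelihood loss and bounded in some way, since pure divergence maximization is unbounded); per the abstract, the paper's actual method distills negative knowledge at multiple levels rather than through a single distribution-level term.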
Anthology ID:
2022.naacl-main.31
Volume:
Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
Month:
July
Year:
2022
Address:
Seattle, United States
Editors:
Marine Carpuat, Marie-Catherine de Marneffe, Ivan Vladimir Meza Ruiz
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
407–418
URL:
https://aclanthology.org/2022.naacl-main.31
DOI:
10.18653/v1/2022.naacl-main.31
Cite (ACL):
Yiwei Li, Shaoxiong Feng, Bin Sun, and Kan Li. 2022. Diversifying Neural Dialogue Generation via Negative Distillation. In Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pages 407–418, Seattle, United States. Association for Computational Linguistics.
Cite (Informal):
Diversifying Neural Dialogue Generation via Negative Distillation (Li et al., NAACL 2022)
PDF:
https://aclanthology.org/2022.naacl-main.31.pdf
Video:
https://aclanthology.org/2022.naacl-main.31.mp4
Data
DailyDialog