Multi-Referenced Training for Dialogue Response Generation

Tianyu Zhao, Tatsuya Kawahara


Abstract
In open-domain dialogue response generation, a dialogue context can be continued with diverse responses, and dialogue models should capture such one-to-many relations. In this work, we first analyze the training objective of dialogue models from the view of Kullback-Leibler divergence (KLD) and show that the gap between the real-world probability distribution and the single-referenced data's probability distribution prevents the model from learning the one-to-many relations efficiently. Then we explore approaches to multi-referenced training in two aspects. Data-wise, we generate diverse pseudo references from a powerful pretrained model to build multi-referenced data that provides a better approximation of the real-world distribution. Model-wise, we propose to equip variational models with an expressive prior, named linear Gaussian model (LGM). Experimental results from both automated and human evaluation show that the methods yield significant improvements over the baselines.
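As a rough, worked illustration of the KLD view summarized above (the notation here is mine, not necessarily the paper's): maximum-likelihood training fits the model to the empirical corpus distribution, which, with a single reference per context, is a poor proxy for the true one-to-many response distribution.

```latex
% Notation (illustrative, not taken from the paper):
%   p^*(y|x)       real-world one-to-many response distribution
%   p_d(y|x)       empirical distribution of the (single-referenced) corpus
%   q_\theta(y|x)  the dialogue model
\[
  \underbrace{\mathbb{E}_{x}\,\mathrm{KL}\!\bigl(p_d(\cdot\mid x)\,\big\|\,q_\theta(\cdot\mid x)\bigr)}_{\text{what MLE on the corpus minimizes}}
  \quad\text{vs.}\quad
  \underbrace{\mathbb{E}_{x}\,\mathrm{KL}\!\bigl(p^{*}(\cdot\mid x)\,\big\|\,q_\theta(\cdot\mid x)\bigr)}_{\text{what we actually want to minimize}}
\]
% With one reference per context, p_d is (nearly) a point mass, so it
% approximates p^* badly; multi-referenced data moves p_d closer to p^*.
```

The data-wise idea can be sketched as sampling several candidate responses per context from a pretrained conversational LM. The sketch below is a minimal, hypothetical illustration using DialoGPT via Hugging Face transformers; the model choice, decoding settings, and the `pseudo_references` helper are my assumptions, not the paper's exact setup (the authors' code is in the repository linked below).

```python
# Minimal sketch (assumptions, not the paper's exact configuration):
# sample n diverse pseudo references per dialogue context from a
# pretrained conversational LM via nucleus sampling.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("microsoft/DialoGPT-medium")
model = AutoModelForCausalLM.from_pretrained("microsoft/DialoGPT-medium")

def pseudo_references(context: str, n: int = 5) -> list[str]:
    """Sample n candidate responses for one dialogue context."""
    input_ids = tokenizer.encode(context + tokenizer.eos_token,
                                 return_tensors="pt")
    outputs = model.generate(
        input_ids,
        do_sample=True,              # stochastic decoding -> diverse outputs
        top_p=0.9,                   # nucleus sampling
        num_return_sequences=n,
        max_new_tokens=40,
        pad_token_id=tokenizer.eos_token_id,
    )
    # Keep only the generated continuation, dropping the context prefix.
    return [tokenizer.decode(out[input_ids.shape[-1]:],
                             skip_special_tokens=True)
            for out in outputs]

print(pseudo_references("How was your weekend?"))
```

Each sampled response can then be paired with the original context to form the multi-referenced training data the abstract describes.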
Anthology ID:
2021.sigdial-1.20
Volume:
Proceedings of the 22nd Annual Meeting of the Special Interest Group on Discourse and Dialogue
Month:
July
Year:
2021
Address:
Singapore and Online
Editors:
Haizhou Li, Gina-Anne Levow, Zhou Yu, Chitralekha Gupta, Berrak Sisman, Siqi Cai, David Vandyke, Nina Dethlefs, Yan Wu, Junyi Jessy Li
Venue:
SIGDIAL
SIG:
SIGDIAL
Publisher:
Association for Computational Linguistics
Pages:
190–201
URL:
https://aclanthology.org/2021.sigdial-1.20
DOI:
10.18653/v1/2021.sigdial-1.20
Cite (ACL):
Tianyu Zhao and Tatsuya Kawahara. 2021. Multi-Referenced Training for Dialogue Response Generation. In Proceedings of the 22nd Annual Meeting of the Special Interest Group on Discourse and Dialogue, pages 190–201, Singapore and Online. Association for Computational Linguistics.
Cite (Informal):
Multi-Referenced Training for Dialogue Response Generation (Zhao & Kawahara, SIGDIAL 2021)
PDF:
https://aclanthology.org/2021.sigdial-1.20.pdf
Video:
https://www.youtube.com/watch?v=PIZcPh7CGcI
Code:
ZHAOTING/dialog-processing
Data:
DailyDialog