Seungeun Rho
2023
Efficient Latent Variable Modeling for Knowledge-Grounded Dialogue Generation
Gunsoo Han | Daejin Jo | Daniel Nam | Eunseop Yoon | Taehwan Kwon | Seungeun Rho | Kyoung-Woon On | Chang Yoo | Sungwoong Kim
Findings of the Association for Computational Linguistics: EMNLP 2023
Knowledge-grounded dialogue generation requires first retrieving appropriate external knowledge based on a conversational context and then generating a response grounded in the retrieved knowledge. In general, these two sequential modules, a knowledge retriever and a response generator, have been trained separately in a supervised manner. However, obtaining intermediate labels of the ground-truth knowledge is expensive, especially in open-domain conversations; latent variable modeling avoids the need for such labels. In this paper, we propose an efficient algorithm for this latent variable modeling that is able to leverage a large amount of dialogue data. Rather than directly training the complex retriever, we adapt a query generator to an off-the-shelf retriever, and the query generator and response generator are jointly trained over the latent variable of the query. Moreover, we employ the evidence lower bound as a training objective and modify it to perform the joint training robustly. Experimental results on diverse knowledge-grounded dialogue datasets show that the proposed algorithm significantly outperforms the supervised learning algorithm even without the use of annotated knowledge, while maintaining efficiency and scalability.
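As a rough sketch of the kind of objective the abstract describes (not taken from the paper; the standard conditional ELBO form, the variables z, q_φ, p_θ, and the retriever R are assumptions for illustration), treating the query z as the latent variable gives a bound of the form

\[
\log p_\theta(y \mid x) \;\ge\; \mathbb{E}_{q_\phi(z \mid x, y)}\!\left[\log p_\theta\big(y \mid x, R(z)\big)\right] \;-\; \mathrm{KL}\big(q_\phi(z \mid x, y) \,\|\, p_\theta(z \mid x)\big),
\]

where x is the dialogue context, y the response, z the generated query, and R(z) the knowledge returned by the off-the-shelf retriever; the reconstruction term trains the response generator on retrieved knowledge while the KL term regularizes the query generator, with no ground-truth knowledge labels required. The paper's actual objective modifies this bound for robust joint training.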