%0 Conference Proceedings
%T CoLV: A Collaborative Latent Variable Model for Knowledge-Grounded Dialogue Generation
%A Zhan, Haolan
%A Shen, Lei
%A Chen, Hongshen
%A Zhang, Hainan
%Y Moens, Marie-Francine
%Y Huang, Xuanjing
%Y Specia, Lucia
%Y Yih, Scott Wen-tau
%S Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
%D 2021
%8 November
%I Association for Computational Linguistics
%C Online and Punta Cana, Dominican Republic
%F zhan-etal-2021-colv
%X Knowledge-grounded dialogue generation has achieved promising performance with the engagement of external knowledge sources. Typical approaches to this task usually perform two relatively independent sub-tasks, i.e., knowledge selection and knowledge-aware response generation. In this paper, in order to improve the diversity of both knowledge selection and knowledge-aware response generation, we propose a collaborative latent variable (CoLV) model to integrate these two aspects simultaneously in separate yet collaborative latent spaces, so as to capture the inherent correlation between knowledge selection and response generation. During generation, our proposed model first draws a knowledge candidate from the latent space conditioned on the dialogue context, and then samples a response from another collaborative latent space conditioned on both the context and the selected knowledge. Experimental results on two widely used knowledge-grounded dialogue datasets show that our model outperforms previous methods on both knowledge selection and response generation.
%R 10.18653/v1/2021.emnlp-main.172
%U https://aclanthology.org/2021.emnlp-main.172
%U https://doi.org/10.18653/v1/2021.emnlp-main.172
%P 2250-2261