Retrieval-Generation Alignment for End-to-End Task-Oriented Dialogue System
Weizhou Shen | Yingqi Gao | Canbin Huang | Fanqi Wan | Xiaojun Quan | Wei Bi
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Developing an efficient retriever to retrieve knowledge from a large-scale knowledge base (KB) is critical for task-oriented dialogue systems to effectively handle localized and specialized tasks. However, widely used generative models such as T5 and ChatGPT often struggle to differentiate subtle differences among the retrieved KB records when generating responses, resulting in suboptimal quality of generated responses. In this paper, we propose the application of maximal marginal likelihood to train a perceptive retriever by utilizing signals from response generation for supervision. In addition, our approach goes beyond considering solely retrieved entities and incorporates various meta knowledge to guide the generator, thus improving the utilization of knowledge. We evaluate our approach on three task-oriented dialogue datasets using T5 and ChatGPT as the backbone models. The results demonstrate that when combined with meta knowledge, the response generator can effectively leverage high-quality knowledge records from the retriever and enhance the quality of generated responses. The code of this work is available at https://github.com/shenwzh3/MK-TOD.
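The abstract describes training the retriever with maximal marginal likelihood (MML), using the response generator's likelihood as a supervision signal. The following is a minimal sketch of that objective only, assuming the common formulation of marginalizing the response likelihood over the top-K retrieved KB records; the function and variable names here are hypothetical placeholders and not the authors' actual code (see the repository linked above for the real implementation).

```python
import torch
import torch.nn.functional as F

def mml_loss(retriever_logits: torch.Tensor,
             generator_log_likelihoods: torch.Tensor) -> torch.Tensor:
    """Negative marginal log-likelihood over K candidate KB records.

    retriever_logits: shape (K,), unnormalized retrieval scores for K candidates.
    generator_log_likelihoods: shape (K,), log p(response | context, record_k)
        computed by the response generator for each candidate record.

    Returns -log sum_k p(record_k | context) * p(response | context, record_k),
    so gradients through the retrieval distribution favor records that help
    the generator produce the gold response.
    """
    log_retrieval_probs = F.log_softmax(retriever_logits, dim=-1)  # log p(z_k | x)
    joint = log_retrieval_probs + generator_log_likelihoods        # log p(z_k | x) + log p(y | x, z_k)
    return -torch.logsumexp(joint, dim=-1)                         # -log p(y | x)

if __name__ == "__main__":
    # Toy usage: random tensors stand in for real retriever scores and
    # generator log-likelihoods (which would come from T5 or a similar model).
    K = 5
    retriever_logits = torch.randn(K, requires_grad=True)
    gen_log_likelihoods = torch.randn(K)
    loss = mml_loss(retriever_logits, gen_log_likelihoods)
    loss.backward()
    print(loss.item(), retriever_logits.grad)
```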