KETOD: Knowledge-Enriched Task-Oriented Dialogue

Zhiyu Chen, Bing Liu, Seungwhan Moon, Chinnadhurai Sankar, Paul Crook, William Yang Wang


Abstract
Existing studies in dialogue system research mostly treat task-oriented dialogue and chit-chat as separate domains. Towards building a human-like assistant that can converse naturally and seamlessly with users, it is important to build a dialogue system that conducts both types of conversations effectively. In this work, we investigate how task-oriented dialogue and knowledge-grounded chit-chat can be effectively integrated into a single model. To this end, we create a new dataset, KETOD (Knowledge-Enriched Task-Oriented Dialogue), where we naturally enrich task-oriented dialogues with chit-chat based on relevant entity knowledge. We also propose two new models, SimpleToDPlus and Combiner, for the proposed task. Experimental results on both automatic and human evaluations show that the proposed methods can significantly improve the performance in knowledge-enriched response generation while maintaining competitive task-oriented dialogue performance. We believe our new dataset will be a valuable resource for future studies. Our dataset and code are publicly available at https://github.com/facebookresearch/ketod.
Anthology ID:
2022.findings-naacl.197
Volume:
Findings of the Association for Computational Linguistics: NAACL 2022
Month:
July
Year:
2022
Address:
Seattle, United States
Venues:
Findings | NAACL
Publisher:
Association for Computational Linguistics
Pages:
2581–2593
URL:
https://aclanthology.org/2022.findings-naacl.197
DOI:
10.18653/v1/2022.findings-naacl.197
Cite (ACL):
Zhiyu Chen, Bing Liu, Seungwhan Moon, Chinnadhurai Sankar, Paul Crook, and William Yang Wang. 2022. KETOD: Knowledge-Enriched Task-Oriented Dialogue. In Findings of the Association for Computational Linguistics: NAACL 2022, pages 2581–2593, Seattle, United States. Association for Computational Linguistics.
Cite (Informal):
KETOD: Knowledge-Enriched Task-Oriented Dialogue (Chen et al., Findings 2022)
PDF:
https://aclanthology.org/2022.findings-naacl.197.pdf
Data
SGD