Retrieval-Free Knowledge-Grounded Dialogue Response Generation with Adapters

Yan Xu, Etsuko Ishii, Samuel Cahyawijaya, Zihan Liu, Genta Indra Winata, Andrea Madotto, Dan Su, Pascale Fung


Abstract
To diversify and enrich generated dialogue responses, knowledge-grounded dialogue has been investigated in recent years. Existing methods tackle the knowledge-grounding challenge by retrieving relevant sentences from a large corpus and augmenting the dialogues with this explicit extra information. Despite their success, however, existing approaches suffer from inference inefficiency. This paper proposes KnowExpert, an end-to-end framework that bypasses the explicit retrieval process, injects knowledge into pre-trained language models with lightweight adapters, and adapts them to the knowledge-grounded dialogue task. To the best of our knowledge, this is the first attempt to tackle this task without retrieval in an open-domain chit-chat scenario. The experimental results show that KnowExpert performs comparably with some retrieval-based baselines while being time-efficient at inference, demonstrating the effectiveness of our proposed method.
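The core mechanism the abstract describes, injecting knowledge into a frozen pre-trained language model through lightweight adapter modules rather than retrieval, can be sketched as a standard bottleneck adapter. The sketch below is illustrative only; the layer names, dimensions, and initialization are assumptions, not the paper's actual configuration.

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

class BottleneckAdapter:
    """A lightweight bottleneck adapter: down-project, nonlinearity,
    up-project, plus a residual connection. Only these small matrices
    would be trained (e.g., on knowledge corpora), while the backbone
    language model stays frozen.
    Dimensions (hidden=768, bottleneck=64) are illustrative assumptions."""

    def __init__(self, hidden_dim=768, bottleneck_dim=64, seed=0):
        rng = np.random.default_rng(seed)
        self.w_down = rng.normal(0.0, 0.02, (hidden_dim, bottleneck_dim))
        self.w_up = rng.normal(0.0, 0.02, (bottleneck_dim, hidden_dim))

    def __call__(self, h):
        # h: (seq_len, hidden_dim) hidden states from a frozen transformer layer.
        # The residual connection keeps the backbone's representation intact
        # while the bottleneck adds a small, trainable correction.
        return h + relu(h @ self.w_down) @ self.w_up

adapter = BottleneckAdapter()
hidden = np.zeros((4, 768))   # dummy hidden states
out = adapter(hidden)
print(out.shape)              # output keeps the input shape: (4, 768)
```

Because the adapter preserves the hidden-state shape, it can be inserted after any transformer layer without modifying the surrounding architecture, which is what makes this style of knowledge injection lightweight.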
Anthology ID:
2022.dialdoc-1.10
Volume:
Proceedings of the Second DialDoc Workshop on Document-grounded Dialogue and Conversational Question Answering
Month:
May
Year:
2022
Address:
Dublin, Ireland
Venues:
ACL | dialdoc
Publisher:
Association for Computational Linguistics
Pages:
93–107
URL:
https://aclanthology.org/2022.dialdoc-1.10
DOI:
10.18653/v1/2022.dialdoc-1.10
Cite (ACL):
Yan Xu, Etsuko Ishii, Samuel Cahyawijaya, Zihan Liu, Genta Indra Winata, Andrea Madotto, Dan Su, and Pascale Fung. 2022. Retrieval-Free Knowledge-Grounded Dialogue Response Generation with Adapters. In Proceedings of the Second DialDoc Workshop on Document-grounded Dialogue and Conversational Question Answering, pages 93–107, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Retrieval-Free Knowledge-Grounded Dialogue Response Generation with Adapters (Xu et al., dialdoc 2022)
PDF:
https://aclanthology.org/2022.dialdoc-1.10.pdf
Code:
hltchkust/knowexpert
Data:
Wizard of Wikipedia