On Controlling Fallback Responses for Grounded Dialogue Generation

Hongyuan Lu, Wai Lam, Hong Cheng, Helen Meng


Abstract
Dialogue agents can leverage external textual knowledge to generate responses of higher quality. To the best of our knowledge, most existing work on knowledge-grounded dialogue assumes that the user intention is always answerable. Unfortunately, this is impractical, as there is no guarantee that a knowledge retriever can always retrieve the desired knowledge. It is therefore crucial to incorporate fallback responses that handle unanswerable contexts appropriately while still responding to answerable contexts in an informative manner. We propose a novel framework in which the generator automatically produces a control token that biases the succeeding response towards informativeness for answerable contexts and towards fallback for unanswerable contexts, in an end-to-end manner. Since no existing knowledge-grounded dialogue dataset considers this aim, we augment an existing dataset with unanswerable contexts to conduct our experiments. Automatic and human evaluation results indicate that naively incorporating fallback responses with controlled text generation still hurts informativeness for answerable contexts. In contrast, our proposed framework effectively mitigates this problem while still presenting appropriate fallback responses to unanswerable contexts. The framework also removes the extra burden of the additional classifier and the overhead introduced by previous works, which operate in a pipeline manner.
Anthology ID:
2022.findings-acl.204
Volume:
Findings of the Association for Computational Linguistics: ACL 2022
Month:
May
Year:
2022
Address:
Dublin, Ireland
Venues:
ACL | Findings
Publisher:
Association for Computational Linguistics
Pages:
2591–2601
URL:
https://aclanthology.org/2022.findings-acl.204
DOI:
10.18653/v1/2022.findings-acl.204
Cite (ACL):
Hongyuan Lu, Wai Lam, Hong Cheng, and Helen Meng. 2022. On Controlling Fallback Responses for Grounded Dialogue Generation. In Findings of the Association for Computational Linguistics: ACL 2022, pages 2591–2601, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
On Controlling Fallback Responses for Grounded Dialogue Generation (Lu et al., Findings 2022)
PDF:
https://aclanthology.org/2022.findings-acl.204.pdf
Software:
 2022.findings-acl.204.software.zip