Think Before You Speak: Learning to Generate Implicit Knowledge for Response Generation by Self-Talk

Pei Zhou, Behnam Hedayatnia, Karthik Gopalakrishnan, Seokhwan Kim, Jay Pujara, Xiang Ren, Yang Liu, Dilek Hakkani-Tur


Abstract
Humans produce appropriate responses based not only on previous dialogue utterances but also on implicit background knowledge such as commonsense. Although neural response generation (RG) models appear to produce human-like responses, they are mostly trained end-to-end and do not generate the intermediate grounding between a dialogue history and its response. This work studies whether and how we can train an RG model that talks with itself to generate implicit knowledge before producing a response. We further investigate whether such models can identify when implicit background knowledge is needed and when it is not. Experimental results show that, compared with models that directly generate responses from a dialogue history, self-talk models produce better-quality responses according to human evaluation of grammaticality, coherence, and engagingness. Models trained to identify when to self-talk further improve response quality. Analysis of the generated implicit knowledge shows that the models mostly use it appropriately in their responses.
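The abstract describes a two-stage "think before you speak" pipeline: first generate implicit knowledge from the dialogue history, then condition the response on both. The sketch below illustrates this idea with a generic Hugging Face seq2seq model; the model name, prompt prefixes, and separator token are illustrative assumptions, not the authors' exact setup.

```python
# Minimal sketch of the two-stage self-talk idea, assuming a generic
# seq2seq RG model. Prompt formats are hypothetical, not the paper's.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

MODEL_NAME = "facebook/bart-base"  # assumption: any seq2seq model works here
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_NAME)

def generate(prompt: str, max_new_tokens: int = 40) -> str:
    """Encode a prompt, decode greedily, and return the generated text."""
    inputs = tokenizer(prompt, return_tensors="pt", truncation=True)
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)

history = "A: I stayed up all night studying. B: You must be exhausted."

# Stage 1 ("think"): the model talks with itself to surface implicit
# background knowledge given only the dialogue history.
knowledge = generate("generate knowledge: " + history)

# Stage 2 ("speak"): the response is conditioned on the history plus
# the self-generated knowledge. "[knowledge]" is a hypothetical separator.
response = generate("generate response: " + history + " [knowledge] " + knowledge)
print(response)
```

The "when to self-talk" variant in the abstract would add a gating step before Stage 1 that decides, per turn, whether generating knowledge is necessary or whether to respond directly.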
Anthology ID:
2021.nlp4convai-1.23
Volume:
Proceedings of the 3rd Workshop on Natural Language Processing for Conversational AI
Month:
November
Year:
2021
Address:
Online
Editors:
Alexandros Papangelis, Paweł Budzianowski, Bing Liu, Elnaz Nouri, Abhinav Rastogi, Yun-Nung Chen
Venue:
NLP4ConvAI
Publisher:
Association for Computational Linguistics
Pages:
251–253
URL:
https://aclanthology.org/2021.nlp4convai-1.23
DOI:
10.18653/v1/2021.nlp4convai-1.23
Cite (ACL):
Pei Zhou, Behnam Hedayatnia, Karthik Gopalakrishnan, Seokhwan Kim, Jay Pujara, Xiang Ren, Yang Liu, and Dilek Hakkani-Tur. 2021. Think Before You Speak: Learning to Generate Implicit Knowledge for Response Generation by Self-Talk. In Proceedings of the 3rd Workshop on Natural Language Processing for Conversational AI, pages 251–253, Online. Association for Computational Linguistics.
Cite (Informal):
Think Before You Speak: Learning to Generate Implicit Knowledge for Response Generation by Self-Talk (Zhou et al., NLP4ConvAI 2021)
PDF:
https://aclanthology.org/2021.nlp4convai-1.23.pdf
Data
ConceptNet, MuTual