Adversarial Learning on the Latent Space for Diverse Dialog Generation

Kashif Khan, Gaurav Sahu, Vikash Balasubramanian, Lili Mou, Olga Vechtomova


Abstract
Generating relevant responses in a dialog is challenging, and requires not only proper modeling of the context in the conversation, but also the ability to generate fluent sentences during inference. In this paper, we propose a two-step framework based on generative adversarial nets for generating conditioned responses. Our model first learns a meaningful representation of sentences by autoencoding, and then learns to map an input query to the response representation, which is in turn decoded as a response sentence. Both quantitative and qualitative evaluations show that our model generates more fluent, relevant, and diverse responses than existing state-of-the-art methods.
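The two-step pipeline described in the abstract can be sketched as follows. This is a minimal illustration only, not the paper's implementation (see the linked repository for that): sentences are stood in for by fixed-size vectors, the encoder/decoder weights that the paper learns by autoencoding and the adversarially trained query-to-response latent mapper are all replaced by hypothetical random linear maps, and all dimensions are made up.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions; real sentences would pass through a
# sequence encoder/decoder rather than fixed-size vectors.
SENT_DIM, LATENT_DIM = 32, 8

# Step 1 (autoencoding): encoder/decoder weights that the paper learns
# by reconstructing sentences; random placeholders stand in here.
W_enc = rng.normal(size=(LATENT_DIM, SENT_DIM))
W_dec = rng.normal(size=(SENT_DIM, LATENT_DIM))

# Step 2 (latent mapping): a map from the query's latent representation
# to a response latent, trained adversarially in the paper; again a
# random linear map stands in.
W_map = rng.normal(size=(LATENT_DIM, LATENT_DIM))

def encode(sent_vec):
    """Map a sentence vector to its latent representation."""
    return W_enc @ sent_vec

def map_latent(query_latent):
    """Predict a response latent from the query latent."""
    return W_map @ query_latent

def decode(latent):
    """Decode a latent vector back into sentence space."""
    return W_dec @ latent

# Inference: encode the query, map it to a response latent, decode.
query = rng.normal(size=SENT_DIM)
response = decode(map_latent(encode(query)))
```

The point of the sketch is the decoupling: fluency is handled by the autoencoder (step 1), while relevance and diversity come from the learned latent-to-latent mapping (step 2), so generation never conditions the decoder directly on the raw query.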
Anthology ID:
2020.coling-main.441
Volume:
Proceedings of the 28th International Conference on Computational Linguistics
Month:
December
Year:
2020
Address:
Barcelona, Spain (Online)
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
5026–5034
URL:
https://aclanthology.org/2020.coling-main.441
DOI:
10.18653/v1/2020.coling-main.441
Cite (ACL):
Kashif Khan, Gaurav Sahu, Vikash Balasubramanian, Lili Mou, and Olga Vechtomova. 2020. Adversarial Learning on the Latent Space for Diverse Dialog Generation. In Proceedings of the 28th International Conference on Computational Linguistics, pages 5026–5034, Barcelona, Spain (Online). International Committee on Computational Linguistics.
Cite (Informal):
Adversarial Learning on the Latent Space for Diverse Dialog Generation (Khan et al., COLING 2020)
PDF:
https://aclanthology.org/2020.coling-main.441.pdf
Code
 vikigenius/conditional_text_generation
Data
DailyDialog