Dialogue Response Generation via Contrastive Latent Representation Learning

Shuyang Dai, Guoyin Wang, Sunghyun Park, Sungjin Lee


Abstract
Large-scale auto-regressive models have achieved great success in dialogue response generation with the help of Transformer layers. However, these models do not learn a representative latent space of the sentence distribution, making it hard to control the generation. Recent works have attempted to learn sentence representations with Transformer-based frameworks, but they do not model the context-response relationship embedded in dialogue datasets. In this work, we aim to construct a robust sentence representation learning model specifically designed for dialogue response generation, built on a Transformer-based encoder-decoder structure. We propose an utterance-level contrastive learning objective that encodes, in each context representation, predictive information about its corresponding response. Extensive experiments verify the robustness of the proposed representation learning mechanism. Using both reference-based and reference-free evaluation metrics, we provide a detailed analysis of the generated sentences, demonstrating the effectiveness of our proposed model.
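The abstract does not spell out the exact form of the utterance-level contrastive objective; a common choice for pairing context representations with their corresponding responses is an InfoNCE-style loss, where each context vector must score its own response above the other responses in the batch. The sketch below is an illustrative assumption, not the paper's verified implementation (the function name, temperature value, and NumPy realization are all hypothetical):

```python
import numpy as np

def utterance_contrastive_loss(context_reps, response_reps, temperature=0.1):
    """InfoNCE-style contrastive loss over a batch of (context, response) pairs.

    context_reps, response_reps: (batch, dim) arrays of latent vectors,
    where row i of each array comes from the same dialogue pair.
    """
    # Normalize so the dot product becomes cosine similarity.
    c = context_reps / np.linalg.norm(context_reps, axis=1, keepdims=True)
    r = response_reps / np.linalg.norm(response_reps, axis=1, keepdims=True)
    # (batch, batch) similarity matrix; the diagonal holds positive pairs.
    logits = (c @ r.T) / temperature
    # Numerically stable log-softmax over each row.
    logits = logits - logits.max(axis=1, keepdims=True)
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # Cross-entropy with the diagonal as the target class for each row.
    return -np.mean(np.diag(log_probs))
```

Under this formulation, correctly matched context-response pairs yield a lower loss than shuffled pairs, which is the property the representation learning relies on.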
Anthology ID:
2021.nlp4convai-1.18
Volume:
Proceedings of the 3rd Workshop on Natural Language Processing for Conversational AI
Month:
November
Year:
2021
Address:
Online
Editors:
Alexandros Papangelis, Paweł Budzianowski, Bing Liu, Elnaz Nouri, Abhinav Rastogi, Yun-Nung Chen
Venue:
NLP4ConvAI
Publisher:
Association for Computational Linguistics
Pages:
189–197
URL:
https://aclanthology.org/2021.nlp4convai-1.18
DOI:
10.18653/v1/2021.nlp4convai-1.18
Cite (ACL):
Shuyang Dai, Guoyin Wang, Sunghyun Park, and Sungjin Lee. 2021. Dialogue Response Generation via Contrastive Latent Representation Learning. In Proceedings of the 3rd Workshop on Natural Language Processing for Conversational AI, pages 189–197, Online. Association for Computational Linguistics.
Cite (Informal):
Dialogue Response Generation via Contrastive Latent Representation Learning (Dai et al., NLP4ConvAI 2021)
PDF:
https://aclanthology.org/2021.nlp4convai-1.18.pdf
Data
DailyDialog