A Randomized Link Transformer for Diverse Open-Domain Dialogue Generation

Jing Yang Lee, Kong Aik Lee, Woon Seng Gan


Abstract
A major issue in open-domain dialogue generation is the agent’s tendency to generate repetitive and generic responses. In recent years, this lack of response diversity has been addressed via latent variable models, such as the Conditional Variational Auto-Encoder (CVAE), which typically learn a latent Gaussian distribution over potential response intents. However, due to latent variable collapse, training latent variable dialogue models is notoriously complex, requiring substantial modification to the standard training process and loss function. Other approaches proposed to improve response diversity also largely entail a significant increase in training complexity. Hence, this paper proposes the Randomized Link (RL) Transformer as an alternative to latent variable models. The RL Transformer requires no additional enhancements to the training process or loss function. Empirical results show that, in terms of response diversity, the RL Transformer achieves performance comparable to that of latent variable models.
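To make the contrast concrete, below is a minimal sketch in PyTorch. The CVAELatent module shows the standard CVAE reparameterization the abstract alludes to (a learned Gaussian over response intents, which adds a KL term to the loss and is prone to latent collapse). The RandomizedLink module is purely a hypothetical illustration of injecting randomness into a transformer sublayer without touching the loss; the paper's actual mechanism is not described on this page, and all names and dimensions here (CVAELatent, RandomizedLink, hidden_dim, latent_dim) are assumptions for illustration only.

```python
import torch
import torch.nn as nn


class CVAELatent(nn.Module):
    """Standard CVAE-style latent intent: sample z from a learned Gaussian
    via the reparameterization trick. Training additionally requires a
    KL(q(z|context, response) || p(z|context)) term, usually with annealing
    to mitigate latent variable collapse."""

    def __init__(self, hidden_dim: int, latent_dim: int):
        super().__init__()
        self.mu = nn.Linear(hidden_dim, latent_dim)
        self.logvar = nn.Linear(hidden_dim, latent_dim)

    def forward(self, context: torch.Tensor) -> torch.Tensor:
        mu, logvar = self.mu(context), self.logvar(context)
        eps = torch.randn_like(mu)
        # z = mu + sigma * eps (reparameterization trick)
        return mu + torch.exp(0.5 * logvar) * eps


class RandomizedLink(nn.Module):
    """Hypothetical randomized link (NOT the paper's exact method): a
    residual projection whose weights are re-sampled on every forward pass,
    so the model produces varied outputs while standard cross-entropy
    training proceeds unchanged."""

    def __init__(self, hidden_dim: int):
        super().__init__()
        self.hidden_dim = hidden_dim

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        w = torch.randn(self.hidden_dim, self.hidden_dim, device=x.device)
        w = w / self.hidden_dim ** 0.5  # scale to keep activations stable
        return x + x @ w                # residual randomized projection
```

The design point the abstract makes is visible in the sketch: the CVAE path changes what must be optimized (an extra KL objective), whereas a randomized link only changes the forward computation, leaving the standard training loop and loss function intact.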
Anthology ID:
2022.nlp4convai-1.1
Volume:
Proceedings of the 4th Workshop on NLP for Conversational AI
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Bing Liu, Alexandros Papangelis, Stefan Ultes, Abhinav Rastogi, Yun-Nung Chen, Georgios Spithourakis, Elnaz Nouri, Weiyan Shi
Venue:
NLP4ConvAI
Publisher:
Association for Computational Linguistics
Pages:
1–11
URL:
https://aclanthology.org/2022.nlp4convai-1.1
DOI:
10.18653/v1/2022.nlp4convai-1.1
Cite (ACL):
Jing Yang Lee, Kong Aik Lee, and Woon Seng Gan. 2022. A Randomized Link Transformer for Diverse Open-Domain Dialogue Generation. In Proceedings of the 4th Workshop on NLP for Conversational AI, pages 1–11, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
A Randomized Link Transformer for Diverse Open-Domain Dialogue Generation (Lee et al., NLP4ConvAI 2022)
PDF:
https://aclanthology.org/2022.nlp4convai-1.1.pdf
Video:
https://aclanthology.org/2022.nlp4convai-1.1.mp4
Data:
DailyDialog