Neural Stylistic Response Generation with Disentangled Latent Variables

Qingfu Zhu, Wei-Nan Zhang, Ting Liu, William Yang Wang


Abstract
Generating open-domain conversational responses in a desired style usually suffers from a lack of parallel data in that style. Meanwhile, using monolingual stylistic data to increase style intensity often comes at the expense of content relevance. In this paper, we propose to disentangle content and style in latent space by diluting sentence-level information in the style representations. A stylistic response can then be obtained by combining the desired style representation with a response content representation. Compared with baselines, our approach achieves a higher BERT-based style intensity score and comparable BLEU scores. Human evaluation results show that our approach significantly improves style intensity while maintaining content relevance.
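The combination step described in the abstract, pairing a disentangled style representation with a response content representation before decoding, can be sketched roughly as follows. The encoder, dimensions, style table, and function names here are illustrative assumptions for exposition, not the paper's actual architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative latent sizes, not taken from the paper.
D_CONTENT, D_STYLE, D_HIDDEN = 64, 16, 80

def encode_content(token_ids):
    """Stand-in content encoder: maps a response to a content vector z_c.
    A real model would use an RNN/Transformer encoder; this just counts tokens."""
    z = np.zeros(D_CONTENT)
    for t in token_ids:
        z[t % D_CONTENT] += 1.0
    return z / max(len(token_ids), 1)

# Hypothetical table of learned style embeddings, one vector per style.
style_table = {
    "formal": rng.normal(size=D_STYLE),
    "humorous": rng.normal(size=D_STYLE),
}

# Decoder projection over the concatenated [content; style] latent.
W_dec = rng.normal(size=(D_HIDDEN, D_CONTENT + D_STYLE))

def combine(z_content, style):
    """Concatenate the content latent with the desired style latent
    and project to a decoder state (a real decoder would generate tokens)."""
    z = np.concatenate([z_content, style_table[style]])
    return W_dec @ z

z_c = encode_content([3, 17, 42, 8])
h = combine(z_c, "humorous")
print(h.shape)  # (80,)
```

Swapping the style key while keeping `z_c` fixed is what lets the same response content be rendered in different styles.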
Anthology ID:
2021.acl-long.339
Volume:
Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)
Month:
August
Year:
2021
Address:
Online
Venues:
ACL | IJCNLP
Publisher:
Association for Computational Linguistics
Pages:
4391–4401
URL:
https://aclanthology.org/2021.acl-long.339
DOI:
10.18653/v1/2021.acl-long.339
Cite (ACL):
Qingfu Zhu, Wei-Nan Zhang, Ting Liu, and William Yang Wang. 2021. Neural Stylistic Response Generation with Disentangled Latent Variables. In Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), pages 4391–4401, Online. Association for Computational Linguistics.
Cite (Informal):
Neural Stylistic Response Generation with Disentangled Latent Variables (Zhu et al., ACL 2021)
PDF:
https://aclanthology.org/2021.acl-long.339.pdf
Video:
https://aclanthology.org/2021.acl-long.339.mp4
Data
DailyDialog