%0 Conference Proceedings
%T Neural Stylistic Response Generation with Disentangled Latent Variables
%A Zhu, Qingfu
%A Zhang, Wei-Nan
%A Liu, Ting
%A Wang, William Yang
%Y Zong, Chengqing
%Y Xia, Fei
%Y Li, Wenjie
%Y Navigli, Roberto
%S Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)
%D 2021
%8 August
%I Association for Computational Linguistics
%C Online
%F zhu-etal-2021-neural
%X Generating open-domain conversational responses in a desired style usually suffers from the lack of parallel data in that style. Meanwhile, using monolingual stylistic data to increase style intensity often comes at the expense of decreased content relevance. In this paper, we propose to disentangle content and style in latent space by diluting sentence-level information in style representations. Combining the desired style representation with a response content representation then yields a stylistic response. Our approach achieves a higher BERT-based style intensity score and comparable BLEU scores compared with baselines. Human evaluation results show that our approach significantly improves style intensity while maintaining content relevance.
%R 10.18653/v1/2021.acl-long.339
%U https://aclanthology.org/2021.acl-long.339
%U https://doi.org/10.18653/v1/2021.acl-long.339
%P 4391-4401