%0 Conference Proceedings
%T Adaptive Parameterization for Neural Dialogue Generation
%A Cai, Hengyi
%A Chen, Hongshen
%A Zhang, Cheng
%A Song, Yonghao
%A Zhao, Xiaofang
%A Yin, Dawei
%Y Inui, Kentaro
%Y Jiang, Jing
%Y Ng, Vincent
%Y Wan, Xiaojun
%S Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP)
%D 2019
%8 November
%I Association for Computational Linguistics
%C Hong Kong, China
%F cai-etal-2019-adaptive
%X Neural conversation systems generate responses based on the sequence-to-sequence (SEQ2SEQ) paradigm. Typically, the model is equipped with a single set of learned parameters to generate responses for given input contexts. When confronting diverse conversations, its adaptability is rather limited and the model is hence prone to generate generic responses. In this work, we propose an Adaptive Neural Dialogue generation model, AdaND, which manages various conversations with conversation-specific parameterization. For each conversation, the model generates parameters of the encoder-decoder by referring to the input context. In particular, we propose two adaptive parameterization mechanisms: a context-aware and a topic-aware parameterization mechanism. The context-aware parameterization directly generates the parameters by capturing local semantics of the given context. The topic-aware parameterization enables parameter sharing among conversations with similar topics by first inferring the latent topics of the given context and then generating the parameters with respect to the distributional topics. Extensive experiments conducted on a large-scale real-world conversational dataset show that our model achieves superior performance in terms of both quantitative metrics and human evaluations.
%R 10.18653/v1/D19-1188
%U https://aclanthology.org/D19-1188
%U https://doi.org/10.18653/v1/D19-1188
%P 1793-1802