Personalizing Dialogue Agents via Meta-Learning

Andrea Madotto, Zhaojiang Lin, Chien-Sheng Wu, Pascale Fung


Abstract
Existing personalized dialogue models use human-designed persona descriptions to improve dialogue consistency. Collecting such descriptions from existing dialogues is expensive and requires hand-crafted feature designs. In this paper, we propose to extend Model-Agnostic Meta-Learning (MAML) (Finn et al., 2017) to personalized dialogue learning without using any persona descriptions. Our model learns to quickly adapt to new personas by leveraging only a few dialogue samples collected from the same user, which is fundamentally different from conditioning the response on persona descriptions. Empirical results on the Persona-chat dataset (Zhang et al., 2018) indicate that our solution outperforms non-meta-learning baselines on automatic evaluation metrics as well as in human-evaluated fluency and consistency.
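To make the adaptation idea concrete, below is a minimal, first-order sketch of MAML-style persona adaptation: an inner loop fine-tunes a copy of the dialogue model on a persona's few support dialogues, and an outer loop updates the shared initialization from the adapted copy's loss on held-out dialogues of the same persona. The toy bag-of-words model, the task sampler, and all hyper-parameters are placeholders for illustration, not the authors' implementation (see HLTCHKUST/PAML for the actual code).

```python
# Hedged sketch of first-order MAML for persona adaptation.
# Everything below (model, data, learning rates) is a toy stand-in.
import copy
import torch
import torch.nn as nn

VOCAB = 100
# Toy "dialogue model": context vector -> distribution over response tokens.
model = nn.Sequential(nn.Linear(VOCAB, 64), nn.ReLU(), nn.Linear(64, VOCAB))
meta_optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()
inner_lr = 0.01

def sample_persona_task(batch_size=8):
    """Placeholder: sample one persona's dialogues and split them into a
    support set (used for adaptation) and a query set (used for the meta-update)."""
    x = torch.rand(batch_size, VOCAB)
    y = torch.randint(0, VOCAB, (batch_size,))
    return (x[:4], y[:4]), (x[4:], y[4:])

for meta_step in range(100):
    meta_optimizer.zero_grad()
    for _ in range(4):  # meta-batch of personas
        (sx, sy), (qx, qy) = sample_persona_task()

        # Inner loop: adapt a copy of the model on the persona's few dialogues.
        learner = copy.deepcopy(model)
        inner_opt = torch.optim.SGD(learner.parameters(), lr=inner_lr)
        loss_fn(learner(sx), sy).backward()
        inner_opt.step()

        # Outer loop (first-order approximation): evaluate the adapted copy on
        # held-out dialogues of the same persona, then accumulate its gradients
        # onto the shared initialization.
        learner.zero_grad()
        loss_fn(learner(qx), qy).backward()
        for p, lp in zip(model.parameters(), learner.parameters()):
            p.grad = lp.grad if p.grad is None else p.grad + lp.grad
    meta_optimizer.step()
```

At test time, the same inner-loop step is all that is needed: a few dialogue turns from a new user adapt the shared initialization to that persona, without any persona description.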
Anthology ID:
P19-1542
Volume:
Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2019
Address:
Florence, Italy
Editors:
Anna Korhonen, David Traum, Lluís Màrquez
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
5454–5459
URL:
https://aclanthology.org/P19-1542
DOI:
10.18653/v1/P19-1542
Cite (ACL):
Andrea Madotto, Zhaojiang Lin, Chien-Sheng Wu, and Pascale Fung. 2019. Personalizing Dialogue Agents via Meta-Learning. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, pages 5454–5459, Florence, Italy. Association for Computational Linguistics.
Cite (Informal):
Personalizing Dialogue Agents via Meta-Learning (Madotto et al., ACL 2019)
PDF:
https://aclanthology.org/P19-1542.pdf
Supplementary:
P19-1542.Supplementary.pdf
Code
HLTCHKUST/PAML