%0 Conference Proceedings
%T Long Time No See! Open-Domain Conversation with Long-Term Persona Memory
%A Xu, Xinchao
%A Gou, Zhibin
%A Wu, Wenquan
%A Niu, Zheng-Yu
%A Wu, Hua
%A Wang, Haifeng
%A Wang, Shihang
%Y Muresan, Smaranda
%Y Nakov, Preslav
%Y Villavicencio, Aline
%S Findings of the Association for Computational Linguistics: ACL 2022
%D 2022
%8 May
%I Association for Computational Linguistics
%C Dublin, Ireland
%F xu-etal-2022-long
%X Most of the open-domain dialogue models tend to perform poorly in the setting of long-term human-bot conversations. The possible reason is that they lack the capability of understanding and memorizing long-term dialogue history information. To address this issue, we present a novel task of Long-term Memory Conversation (LeMon) and then build a new dialogue dataset DuLeMon and a dialogue generation framework with Long-Term Memory (LTM) mechanism (called PLATO-LTM). This LTM mechanism enables our system to accurately extract and continuously update long-term persona memory without requiring multiple-session dialogue datasets for model training. To our knowledge, this is the first attempt to conduct real-time dynamic management of persona information of both parties, including the user and the bot. Results on DuLeMon indicate that PLATO-LTM can significantly outperform baselines in terms of long-term dialogue consistency, leading to better dialogue engagingness.
%R 10.18653/v1/2022.findings-acl.207
%U https://aclanthology.org/2022.findings-acl.207
%U https://doi.org/10.18653/v1/2022.findings-acl.207
%P 2639-2650