Personalized Large Language Model Assistant with Evolving Conditional Memory

Ruifeng Yuan, Shichao Sun, Yongqi Li, Zili Wang, Ziqiang Cao, Wenjie Li


Abstract
With the rapid development of large language models, AI assistants such as ChatGPT have become increasingly integrated into people’s work and lives, yet they remain limited in providing personalized services. In this paper, we present a plug-and-play framework that facilitates personalized large language model assistants with evolving conditional memory. The personalized assistant focuses on intelligently preserving the knowledge and experience from its dialogue history with the user, which can then be applied to future responses that better align with the user’s preferences. In general, the assistant generates a set of records from each dialogue, stores them in a memory bank, and retrieves related memory to improve the quality of its responses. For the crucial memory design, we explore different ways of constructing the memory and propose a new memorizing mechanism, named conditional memory, to enhance the framework’s memory management. We also investigate how memory is retrieved and used in the generation process. To better evaluate personalized assistants’ abilities, we build the first evaluation benchmark covering three critical aspects: continuing previous dialogues, learning personalized knowledge, and learning from user feedback. The experimental results illustrate the effectiveness of our method.
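The generate–store–retrieve loop described in the abstract can be sketched as follows. All names here (`MemoryBank`, `store`, `retrieve`) are illustrative assumptions, not the authors’ implementation, and similarity is computed with a toy bag-of-words cosine rather than a learned retriever or the paper’s conditional memory mechanism.

```python
# Sketch of a memory bank: records distilled from past dialogues are
# stored, then the most relevant ones are retrieved for a new query.
import math
import re
from collections import Counter

def embed(text):
    """Toy bag-of-words vector standing in for a dense encoder."""
    return Counter(re.findall(r"\w+", text.lower()))

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

class MemoryBank:
    def __init__(self):
        self.records = []  # list of (record_text, vector) pairs

    def store(self, records):
        """Store textual records generated from a finished dialogue."""
        for r in records:
            self.records.append((r, embed(r)))

    def retrieve(self, query, k=2):
        """Return the k stored records most similar to the query."""
        q = embed(query)
        ranked = sorted(self.records,
                        key=lambda rv: cosine(q, rv[1]),
                        reverse=True)
        return [r for r, _ in ranked[:k]]

bank = MemoryBank()
bank.store(["User prefers concise answers.",
            "User is learning Rust.",
            "User dislikes bullet lists."])
# Retrieved records would be prepended to the prompt when generating
# the tailored response.
print(bank.retrieve("How should I answer this Rust question?", k=1))
```

In the paper’s framework the retrieved memory conditions the response generation; here the retrieval step alone is shown, with the memory update and conditional mechanism left out for brevity.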
Anthology ID:
2025.coling-main.254
Volume:
Proceedings of the 31st International Conference on Computational Linguistics
Month:
January
Year:
2025
Address:
Abu Dhabi, UAE
Editors:
Owen Rambow, Leo Wanner, Marianna Apidianaki, Hend Al-Khalifa, Barbara Di Eugenio, Steven Schockaert
Venue:
COLING
Publisher:
Association for Computational Linguistics
Pages:
3764–3777
URL:
https://aclanthology.org/2025.coling-main.254/
Cite (ACL):
Ruifeng Yuan, Shichao Sun, Yongqi Li, Zili Wang, Ziqiang Cao, and Wenjie Li. 2025. Personalized Large Language Model Assistant with Evolving Conditional Memory. In Proceedings of the 31st International Conference on Computational Linguistics, pages 3764–3777, Abu Dhabi, UAE. Association for Computational Linguistics.
Cite (Informal):
Personalized Large Language Model Assistant with Evolving Conditional Memory (Yuan et al., COLING 2025)
PDF:
https://aclanthology.org/2025.coling-main.254.pdf