Compress to Impress: Unleashing the Potential of Compressive Memory in Real-World Long-Term Conversations

Nuo Chen, Hongguang Li, Jianhui Chang, Juhua Huang, Baoyuan Wang, Jia Li


Abstract
Existing retrieval-based methods have made significant strides in maintaining long-term conversations. However, these approaches face challenges in memory database management and accurate memory retrieval, hindering their efficacy in dynamic, real-world interactions. This study introduces a novel framework, COmpressive Memory-Enhanced Dialogue sYstems (COMEDY), which eschews traditional retrieval modules and memory databases. Instead, COMEDY adopts a “One-for-All” approach, utilizing a single language model to manage memory generation, compression, and response generation. Central to this framework is the concept of compressive memory, which integrates session-specific summaries, user-bot dynamics, and past events into a concise memory format. To support COMEDY, we collect the largest Chinese long-term conversation dataset to date, Dolphin, derived from real user-chatbot interactions. Comparative evaluations demonstrate COMEDY’s superiority over traditional retrieval-based methods in producing more nuanced and human-like conversational experiences.
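The abstract describes a three-task pipeline handled by one model: generating per-session memories, compressing them into a single compressive memory, and generating memory-grounded responses. Below is a minimal sketch of how such a “One-for-All” pipeline could be wired up; the prompts, function names, and the injected `llm` callable are illustrative assumptions, not the authors’ actual implementation.

```python
# Hypothetical sketch of a "One-for-All" compressive-memory pipeline.
# A single language model (any text-generation backend passed in as `llm`)
# performs all three tasks, with no retrieval module or memory database.
from typing import Callable, List

LLM = Callable[[str], str]  # takes a prompt, returns generated text

def generate_session_memories(llm: LLM, sessions: List[str]) -> List[str]:
    """Task 1: summarize each past session into a session-specific memory."""
    return [
        llm(f"Summarize the key events and user traits in this session:\n{s}")
        for s in sessions
    ]

def compress_memories(llm: LLM, memories: List[str]) -> str:
    """Task 2: merge per-session memories into one concise compressive memory
    covering session summaries, user-bot dynamics, and past events."""
    joined = "\n".join(memories)
    return llm(f"Condense these session memories into one concise profile:\n{joined}")

def respond(llm: LLM, compressive_memory: str, dialogue_context: str) -> str:
    """Task 3: generate a memory-grounded reply for the current turn."""
    return llm(f"Memory:\n{compressive_memory}\n\nDialogue:\n{dialogue_context}\nBot:")
```

With any backend plugged in as `llm`, the same model serves all three stages, which is the design choice the abstract contrasts with retrieval-based systems that maintain and query a separate memory database.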
Anthology ID:
2025.coling-main.51
Volume:
Proceedings of the 31st International Conference on Computational Linguistics
Month:
January
Year:
2025
Address:
Abu Dhabi, UAE
Editors:
Owen Rambow, Leo Wanner, Marianna Apidianaki, Hend Al-Khalifa, Barbara Di Eugenio, Steven Schockaert
Venue:
COLING
Publisher:
Association for Computational Linguistics
Pages:
755–773
URL:
https://aclanthology.org/2025.coling-main.51/
Cite (ACL):
Nuo Chen, Hongguang Li, Jianhui Chang, Juhua Huang, Baoyuan Wang, and Jia Li. 2025. Compress to Impress: Unleashing the Potential of Compressive Memory in Real-World Long-Term Conversations. In Proceedings of the 31st International Conference on Computational Linguistics, pages 755–773, Abu Dhabi, UAE. Association for Computational Linguistics.
Cite (Informal):
Compress to Impress: Unleashing the Potential of Compressive Memory in Real-World Long-Term Conversations (Chen et al., COLING 2025)
PDF:
https://aclanthology.org/2025.coling-main.51.pdf