PRINCIPLES: Synthetic Strategy Memory for Proactive Dialogue Agents

Namyoung Kim, Kai Tzu-iunn Ong, Yeonjun Hwang, Minseok Kang, Iiseo Jihn, Gayoung Kim, Minju Kim, Jinyoung Yeo


Abstract
Dialogue agents based on large language models (LLMs) have shown promising performance in proactive dialogue, which requires effective strategy planning. However, existing approaches to strategy planning for proactive dialogue face several limitations: limited strategy coverage, preference bias in planning, and reliance on costly additional training. To address these, we propose PRINCIPLES: a synthetic strategy memory for proactive dialogue agents. PRINCIPLES is derived through offline self-play simulations and serves as reusable knowledge that guides strategy planning during inference, eliminating the need for additional training and data annotation. We evaluate PRINCIPLES in both emotional support and persuasion domains, demonstrating consistent improvements over strong baselines. Furthermore, PRINCIPLES maintains its robustness across extended and more diverse evaluation settings. See our project page at https://huggingface.co/spaces/kimnamssya/Principles.
Anthology ID:
2025.findings-emnlp.1164
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2025
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
21329–21368
URL:
https://aclanthology.org/2025.findings-emnlp.1164/
Cite (ACL):
Namyoung Kim, Kai Tzu-iunn Ong, Yeonjun Hwang, Minseok Kang, Iiseo Jihn, Gayoung Kim, Minju Kim, and Jinyoung Yeo. 2025. PRINCIPLES: Synthetic Strategy Memory for Proactive Dialogue Agents. In Findings of the Association for Computational Linguistics: EMNLP 2025, pages 21329–21368, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
PRINCIPLES: Synthetic Strategy Memory for Proactive Dialogue Agents (Kim et al., Findings 2025)
PDF:
https://aclanthology.org/2025.findings-emnlp.1164.pdf
Checklist:
2025.findings-emnlp.1164.checklist.pdf