Beyond Simple Personas: Evaluating LLMs and Relevance Models for Character-Consistent Dialogue

Debaditya Pal, David Traum


Abstract
Dialogue systems often rely on overly simplistic persona representations, limiting their capacity to portray realistic, nuanced characters. In this paper, we explore how well existing persona-grounding methods capture complex personalities using two character-rich domains—Sgt Blackwell (single-character) and Twins (two-character)—described extensively through detailed narratives. We compare early fusion techniques, Retrieval-Augmented Generation (RAG), and relevance-based approaches. Evaluations across entailment, persona alignment, and hallucination metrics reveal distinct trade-offs: Knowledge Graph fusion notably reduces hallucinations and maintains relevance, Persona fusion strongly preserves relevance but has higher hallucination rates, and RAG provides fast, fluent responses. Our findings emphasize the critical role of structured persona grounding in achieving nuanced personality modeling.
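The abstract names Retrieval-Augmented Generation as one of the compared grounding strategies but the page carries no implementation details. As a rough, illustrative sketch of what a RAG-style persona-grounding step can look like, the Python below retrieves the persona passages most similar to a user turn and prepends them to the generation prompt. The bag-of-words retriever, the passage granularity, the prompt wording, and the placeholder persona facts are all assumptions for illustration, not the authors' system.

from collections import Counter
import math

def bow_cosine(a: str, b: str) -> float:
    """Cosine similarity over bag-of-words counts (a stand-in for a learned retriever)."""
    ca, cb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(ca[w] * cb[w] for w in ca)
    norm = math.sqrt(sum(v * v for v in ca.values())) * math.sqrt(sum(v * v for v in cb.values()))
    return dot / norm if norm else 0.0

def retrieve(query: str, persona_passages: list[str], k: int = 3) -> list[str]:
    """Return the k persona passages most similar to the user turn."""
    return sorted(persona_passages, key=lambda p: bow_cosine(query, p), reverse=True)[:k]

def build_prompt(query: str, persona_passages: list[str]) -> str:
    """Prepend the retrieved persona evidence to the user turn (hypothetical RAG-style prompt)."""
    evidence = "\n".join(f"- {p}" for p in retrieve(query, persona_passages))
    return (
        "Stay in character. Use only the persona facts below.\n"
        f"Persona facts:\n{evidence}\n"
        f"User: {query}\nCharacter:"
    )

if __name__ == "__main__":
    # Placeholder persona narrative split into passages; made up for the example,
    # not taken from the paper's Sgt Blackwell or Twins domains.
    passages = [
        "The character grew up in a small coastal town.",
        "The character is blunt but loyal to close friends.",
        "The character avoids discussing their time abroad.",
    ]
    prompt = build_prompt("Tell me about where you are from.", passages)
    print(prompt)  # In practice this prompt would be passed to an LLM to generate the in-character reply.

In an actual RAG pipeline the bag-of-words scorer would typically be replaced by a dense retriever, and the concatenated prompt would be sent to the language model; the structure of retrieve-then-prompt is the part this sketch is meant to show.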
Anthology ID:
2025.sigdial-1.31
Volume:
Proceedings of the 26th Annual Meeting of the Special Interest Group on Discourse and Dialogue
Month:
August
Year:
2025
Address:
Avignon, France
Editors:
Frédéric Béchet, Fabrice Lefèvre, Nicholas Asher, Seokhwan Kim, Teva Merlin
Venue:
SIGDIAL
Publisher:
Association for Computational Linguistics
Pages:
383–396
URL:
https://aclanthology.org/2025.sigdial-1.31/
Cite (ACL):
Debaditya Pal and David Traum. 2025. Beyond Simple Personas: Evaluating LLMs and Relevance Models for Character-Consistent Dialogue. In Proceedings of the 26th Annual Meeting of the Special Interest Group on Discourse and Dialogue, pages 383–396, Avignon, France. Association for Computational Linguistics.
Cite (Informal):
Beyond Simple Personas: Evaluating LLMs and Relevance Models for Character-Consistent Dialogue (Pal & Traum, SIGDIAL 2025)
PDF:
https://aclanthology.org/2025.sigdial-1.31.pdf