Debaditya Pal


2025

Beyond Simple Personas: Evaluating LLMs and Relevance Models for Character-Consistent Dialogue
Debaditya Pal | David Traum
Proceedings of the 26th Annual Meeting of the Special Interest Group on Discourse and Dialogue

Dialogue systems often rely on overly simplistic persona representations, limiting their capacity to portray realistic, nuanced characters. In this paper, we explore how well existing persona-grounding methods capture complex personalities using two character-rich domains, Sgt Blackwell (single-character) and Twins (two-character), each described extensively through detailed narratives. We compare early fusion techniques, Retrieval-Augmented Generation (RAG), and relevance-based approaches. Evaluations across entailment, persona alignment, and hallucination metrics reveal distinct trade-offs: Knowledge Graph fusion notably reduces hallucinations while maintaining relevance; Persona fusion strongly preserves relevance but has higher hallucination rates; and RAG provides fast, fluent responses. Our findings emphasize the critical role of structured persona grounding in achieving nuanced personality modeling.