2025
Developing A German Document-Level Parallel Dataset For Automatic Text Simplification To Generate Easy Language
Vivien Jiranek | Stefan Hillmann
Proceedings of the 21st Conference on Natural Language Processing (KONVENS 2025): Workshops
Evaluating Large Language Models for Enhancing Live Chat Therapy: A Comparative Study with Psychotherapists
Neha Pravin Deshpande | Stefan Hillmann | Sebastian Möller
Proceedings of the 26th Annual Meeting of the Special Interest Group on Discourse and Dialogue
Large Language Models (LLMs) hold promise for addressing the shortage of qualified therapists in mental health care. While chatbot-based Cognitive Behavioral Therapy (CBT) tools exist, their efficacy in sensitive contexts remains underexplored. This study examines the potential of LLMs to support therapy sessions aimed at reducing Child Sexual Abuse Material (CSAM) consumption. We propose a Retrieval-Augmented Generation (RAG) framework that leverages a fine-tuned BERT-based retriever to guide LLM-generated responses, better capturing the multi-turn, context-specific dynamics of therapy. Four LLMs—Qwen2-7B-Instruct, Mistral-7B-Instruct-v0.3, Orca-2-13B, and Zephyr-7B-Alpha—were evaluated in a small-scale study with 14 domain-expert psychotherapists. Our comparative analysis reveals that, in certain scenarios, LLMs like Mistral-7B-Instruct-v0.3 and Orca-2-13B were preferred over human therapist responses. While limited by sample size, these findings suggest that LLMs can perform at a level comparable to or even exceeding that of human therapists, especially in therapy focused on reducing CSAM consumption. Our code is available online: https://git.tu-berlin.de/neha.deshpande/therapy_responses/-/tree/main
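To make the retrieval-augmented setup concrete, here is a minimal sketch of how a retriever can guide LLM responses over a multi-turn dialog context. The retriever checkpoint, the guidance corpus, and the prompt template below are illustrative assumptions, not the authors' actual artifacts (those are in the linked repository), and a general-purpose sentence encoder stands in for the fine-tuned BERT-based retriever.

```python
# Sketch: retrieval-augmented generation over the full dialog context.
# Corpus, checkpoints, and prompt wording are hypothetical placeholders.
from sentence_transformers import SentenceTransformer, util
from transformers import pipeline

# Hypothetical guidance snippets the retriever searches over.
guidance_corpus = [
    "Acknowledge the client's effort in seeking help before exploring triggers.",
    "Use open-ended questions to explore situations that precede urges.",
    "Reinforce concrete coping strategies the client has already tried.",
]

retriever = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")  # stand-in retriever
corpus_emb = retriever.encode(guidance_corpus, convert_to_tensor=True)

def build_prompt(dialog_history: list[str], top_k: int = 2) -> str:
    """Retrieve guidance for the whole dialog context, not just the last turn."""
    context = " ".join(dialog_history)
    query_emb = retriever.encode(context, convert_to_tensor=True)
    hits = util.semantic_search(query_emb, corpus_emb, top_k=top_k)[0]
    guidance = "\n".join(guidance_corpus[h["corpus_id"]] for h in hits)
    return (
        "You are a CBT-oriented therapist in a live chat session.\n"
        f"Relevant guidance:\n{guidance}\n\n"
        f"Dialog so far:\n{context}\n\nTherapist:"
    )

generator = pipeline("text-generation", model="mistralai/Mistral-7B-Instruct-v0.3")
history = [
    "Client: I keep slipping back into old habits when I'm stressed.",
    "Therapist: Thank you for telling me. What usually happens right before?",
    "Client: Mostly late at night, when I'm alone and can't sleep.",
]
print(generator(build_prompt(history), max_new_tokens=120)[0]["generated_text"])
```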
2023
Context-Aware Module Selection in Modular Dialog Systems
Jan Nehring | René Marcel Berk | Stefan Hillmann
Proceedings of the 14th International Conference on Recent Advances in Natural Language Processing
In modular dialog systems, a dialog system consists of multiple conversational agents. The task “module selection” selects the appropriate sub-dialog system for an incoming user utterance. Current models for module selection use features derived from the current user turn only, such as the utterance’s text or the confidence values of the natural language understanding systems of the individual conversational agents, or they perform text classification on the user utterance. However, dialogs often span multiple turns, and turns are embedded into a context. Therefore, looking at the current user turn only is a source of error in certain situations. This work proposes four models for module selection that incorporate the dialog history, in addition to the current user turn, into module selection. We show that these models surpass the current state of the art in module selection.
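A minimal sketch of the context-aware idea follows: rather than classifying the current user utterance in isolation, recent turns are concatenated with it and fed to a classifier that picks the responsible sub-dialog system. The module labels, separator, context window, and base model are illustrative assumptions, not the paper's exact architectures.

```python
# Sketch: module selection over dialog history + current turn.
# Labels, context length, and the base model are hypothetical choices.
from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

MODULES = ["smalltalk", "banking", "weather"]  # hypothetical sub-dialog systems

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=len(MODULES)
)  # would be fine-tuned on (history + utterance, module) pairs

def select_module(history: list[str], utterance: str) -> str:
    """Score the current turn together with its recent dialog context."""
    text = tokenizer.sep_token.join(history[-3:] + [utterance])  # last 3 turns as context
    inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)
    with torch.no_grad():
        logits = model(**inputs).logits
    return MODULES[int(logits.argmax(dim=-1))]

history = ["Hi there!", "Hello, how can I help?", "I'd like to check something."]
print(select_module(history, "What's my account balance?"))
```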
2022
Towards Personality-Aware Chatbots
Daniel Fernau | Stefan Hillmann | Nils Feldhus | Tim Polzehl | Sebastian Möller
Proceedings of the 23rd Annual Meeting of the Special Interest Group on Discourse and Dialogue
Chatbots are increasingly used to automate operational processes in customer service. However, most chatbots lack adaptation towards their users, which may result in an unsatisfactory experience. Since knowing and meeting personal preferences is a key factor for enhancing usability in conversational agents, in this study we analyze an adaptive conversational agent that can automatically adjust according to a user’s personality type, derived from the Myers-Briggs type indicator. An experiment including 300 crowd workers examined how typifications such as extroversion/introversion and thinking/feeling can be assessed and designed for a conversational agent in a job recommender domain. Our results validate the proposed design choices, and experiments on a user-matched personality typification, following the so-called law of attraction rule, show a significant positive influence on a range of selected usability criteria such as overall satisfaction, naturalness, promoter score, trust and appropriateness of the conversation.
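As a small illustration of the matched ("law of attraction") condition, the sketch below mirrors the user's assessed type on the extroversion/introversion and thinking/feeling dimensions when phrasing a recommendation. The style templates and the data structure are illustrative placeholders, not the wordings or assessment procedure evaluated in the study.

```python
# Sketch: mirroring the user's personality type in the chatbot's response style.
# Templates are hypothetical examples for a job recommender domain.
from dataclasses import dataclass

@dataclass
class UserProfile:
    extroverted: bool   # E vs. I
    thinking: bool      # T vs. F

STYLE_TEMPLATES = {
    ("E", "T"): "Great, let's get straight to it: here are three roles that match your skills.",
    ("E", "F"): "Exciting! I found a few roles where I think you'd really feel at home.",
    ("I", "T"): "Based on your profile, the following positions are the closest technical fit.",
    ("I", "F"): "Take your time; these positions seem like a comfortable match for you.",
}

def styled_recommendation(user: UserProfile) -> str:
    """Pick the template that mirrors the user's type (matched condition)."""
    key = ("E" if user.extroverted else "I", "T" if user.thinking else "F")
    return STYLE_TEMPLATES[key]

print(styled_recommendation(UserProfile(extroverted=False, thinking=True)))
```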