PATIENT-Ψ: Using Large Language Models to Simulate Patients for Training Mental Health Professionals
Ruiyi Wang | Stephanie Milani | Jamie Chiu | Jiayin Zhi | Shaun Eack | Travis Labrum | Samuel Murphy | Nev Jones | Kate Hardy | Hong Shen | Fei Fang | Zhiyu Chen
Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing
Mental illness remains one of the most critical public health issues. Despite its importance, many mental health professionals highlight a disconnect between their training and real-world patient practice. To help bridge this gap, we propose PATIENT-Ψ, a novel patient simulation framework for cognitive behavior therapy (CBT) training. To build PATIENT-Ψ, we construct diverse patient cognitive models based on CBT principles and use large language models (LLMs) programmed with these cognitive models to act as simulated therapy patients. We propose an interactive training scheme, PATIENT-Ψ-TRAINER, for mental health trainees to practice a key skill in CBT, formulating the cognitive model of the patient, by role-playing a therapy session with PATIENT-Ψ. To evaluate PATIENT-Ψ, we conducted a comprehensive user study with 13 mental health trainees and 20 experts. The results demonstrate that practice with PATIENT-Ψ-TRAINER enhances trainees' perceived skill acquisition and confidence beyond existing forms of training such as textbooks, videos, and role-play with non-patients. Based on the experts' perceptions, PATIENT-Ψ is closer to real patient interactions than GPT-4, and PATIENT-Ψ-TRAINER holds strong promise for improving trainee competencies. Our code and data are released at https://github.com/ruiyiw/patient-psi.
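The abstract describes the framework only at a high level. As a rough illustration of the general idea (not the authors' implementation), the sketch below shows how a CBT-style cognitive model might be serialized into a system prompt for an off-the-shelf chat LLM to role-play a patient. All field names, the example content, and the prompt wording are hypothetical assumptions introduced here for illustration.

```python
# Hypothetical sketch: rendering a CBT cognitive model as a patient-persona prompt.
# The schema and wording are illustrative assumptions, not the PATIENT-Ψ format.
from dataclasses import dataclass, field


@dataclass
class CognitiveModel:
    situation: str
    core_belief: str
    intermediate_belief: str
    automatic_thoughts: list[str] = field(default_factory=list)
    emotions: list[str] = field(default_factory=list)
    behaviors: list[str] = field(default_factory=list)


def build_patient_prompt(model: CognitiveModel) -> str:
    """Turn the cognitive model into a system prompt for a role-playing LLM."""
    return (
        "You are role-playing a therapy patient in a CBT session.\n"
        f"Situation: {model.situation}\n"
        f"Core belief: {model.core_belief}\n"
        f"Intermediate belief: {model.intermediate_belief}\n"
        f"Automatic thoughts: {'; '.join(model.automatic_thoughts)}\n"
        f"Emotions: {', '.join(model.emotions)}\n"
        f"Behaviors: {'; '.join(model.behaviors)}\n"
        "Reveal these beliefs only gradually, as a real patient would."
    )


example = CognitiveModel(
    situation="Was not invited to a coworker's gathering",
    core_belief="I am unlovable",
    intermediate_belief="If people exclude me, it means they dislike me",
    automatic_thoughts=["They left me out on purpose"],
    emotions=["sadness", "shame"],
    behaviors=["withdraws from colleagues"],
)
print(build_patient_prompt(example))
```

The resulting string would be passed as the system message to a chat model, with the trainee's utterances supplied as user messages; the released repository should be consulted for the actual prompt design and cognitive-model schema.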