Think Before You Speak: Cultivating Communication Skills of Large Language Models via Inner Monologue

Junkai Zhou, Liang Pang, Huawei Shen, Xueqi Cheng


Abstract
The emergence of large language models (LLMs) has further improved the capabilities of open-domain dialogue systems, which can now generate fluent, coherent, and diverse responses. However, LLMs still lack a crucial ability: communication skills. This limitation makes them more like information-seeking tools than anthropomorphic chatbots. Communication skills such as topic transition, proactively asking questions, concept guidance, empathy, and summarising should be taken into consideration to make LLMs more anthropomorphic and proactive during conversations, thereby increasing users' interest and attracting them to chat for longer. However, enabling these communication skills in black-box LLMs remains a key challenge, because they do not follow the same utterance-formation mode as real people: thinking before speaking. Inspired by linguistics and cognitive science, we empower LLMs with communication skills through inner monologues. To evaluate various communication skills, we construct a benchmark named Cskills, which also provides a more comprehensive evaluation of a model's dialogue generation ability. Experimental results show that the proposed CSIM strategy improves the backbone models and outperforms the baselines.
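The abstract describes the core idea only at a high level (a hidden "inner monologue" produced before the visible reply). Below is a minimal, hedged sketch of how such a think-before-speaking strategy could be realized purely through prompting a black-box LLM; the function names (`generate`, `inner_monologue_then_reply`), the skill list, and the two-stage prompt wording are illustrative assumptions, not the authors' released implementation of CSIM.

```python
# Illustrative sketch (not the authors' code): a two-stage "think before you speak"
# prompting loop. `generate` is a stand-in for any black-box LLM completion call.

SKILLS = ["topic transition", "proactively asking questions",
          "concept guidance", "empathy", "summarising"]

def generate(prompt: str) -> str:
    """Placeholder for a black-box LLM call (e.g., a chat-completions API)."""
    raise NotImplementedError

def inner_monologue_then_reply(dialogue_history: str) -> str:
    # Stage 1: hidden inner monologue -- decide which communication skill fits
    # the current dialogue and how to apply it; this text is never shown to the user.
    monologue = generate(
        "You are chatting with a user. Before replying, think silently about which "
        f"communication skill ({', '.join(SKILLS)}) fits the dialogue below and how "
        "to apply it.\n\nDialogue:\n" + dialogue_history + "\n\nInner monologue:"
    )

    # Stage 2: visible response, conditioned on the hidden monologue.
    reply = generate(
        "Dialogue:\n" + dialogue_history +
        "\n\nYour private plan (do not reveal it):\n" + monologue +
        "\n\nNow write only your next reply to the user:"
    )
    return reply
```

In this sketch, only the second generation is surfaced to the user, mirroring the paper's framing that the model should "think before speaking" while keeping the monologue internal.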
Anthology ID:
2024.findings-naacl.248
Volume:
Findings of the Association for Computational Linguistics: NAACL 2024
Month:
June
Year:
2024
Address:
Mexico City, Mexico
Editors:
Kevin Duh, Helena Gomez, Steven Bethard
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
3925–3951
URL:
https://aclanthology.org/2024.findings-naacl.248
Cite (ACL):
Junkai Zhou, Liang Pang, Huawei Shen, and Xueqi Cheng. 2024. Think Before You Speak: Cultivating Communication Skills of Large Language Models via Inner Monologue. In Findings of the Association for Computational Linguistics: NAACL 2024, pages 3925–3951, Mexico City, Mexico. Association for Computational Linguistics.
Cite (Informal):
Think Before You Speak: Cultivating Communication Skills of Large Language Models via Inner Monologue (Zhou et al., Findings 2024)
PDF:
https://aclanthology.org/2024.findings-naacl.248.pdf
Copyright:
2024.findings-naacl.248.copyright.pdf