Aligning to Adults Is Easy, Aligning to Children Is Hard: A Study of Linguistic Alignment in Dialogue Systems

Dorothea French, Sidney D’Mello, Katharina von der Wense


Abstract
During conversations, people align to one another over time by using similar words, concepts, and syntax. This helps form a shared understanding of the conversational content and is associated with increased engagement and satisfaction. It also affects conversation outcomes: e.g., when talking to language learners, an above-normal level of linguistic alignment by parents or language teachers is correlated with faster language acquisition. These benefits make human-like alignment an important property of dialogue systems, one that has often been overlooked by the NLP community. To fill this gap, we ask: (RQ1) Given its importance for engagement and satisfaction, to what degree do state-of-the-art dialogue systems align to adult users? (RQ2) With a potential application to child language acquisition in mind, do systems, like parents, show high levels of alignment during conversations with children? Our experiments show that ChatGPT aligns to adults at roughly human levels, while Llama2 shows elevated alignment. However, when responding to a child, both systems' alignment is below human levels.
Anthology ID:
2024.hucllm-1.7
Volume:
Proceedings of the 1st Human-Centered Large Language Modeling Workshop
Month:
August
Year:
2024
Address:
TBD
Editors:
Nikita Soni, Lucie Flek, Ashish Sharma, Diyi Yang, Sara Hooker, H. Andrew Schwartz
Venues:
HuCLLM | WS
Publisher:
ACL
Pages:
81–87
URL:
https://aclanthology.org/2024.hucllm-1.7
Cite (ACL):
Dorothea French, Sidney D’Mello, and Katharina von der Wense. 2024. Aligning to Adults Is Easy, Aligning to Children Is Hard: A Study of Linguistic Alignment in Dialogue Systems. In Proceedings of the 1st Human-Centered Large Language Modeling Workshop, pages 81–87, TBD. ACL.
Cite (Informal):
Aligning to Adults Is Easy, Aligning to Children Is Hard: A Study of Linguistic Alignment in Dialogue Systems (French et al., HuCLLM-WS 2024)
PDF:
https://aclanthology.org/2024.hucllm-1.7.pdf