Evaluating the Performance of Transformer-based Language Models for Neuroatypical Language

Duanchen Liu, Zoey Liu, Qingyun Yang, Yujing Huang, Emily Prud’hommeaux


Abstract
Difficulties with social aspects of language are among the hallmarks of autism spectrum disorder (ASD). These communication differences are thought to contribute to the challenges that adults with ASD experience when seeking employment, underscoring the need for interventions that focus on improving areas of weakness in pragmatic and social language. In this paper, we describe a transformer-based framework for identifying linguistic features associated with social aspects of communication, using a corpus of conversations produced during collaborative tasks between adults with and without ASD and their neurotypical conversational partners. While our framework yields strong accuracy overall, performance is significantly worse for the language of participants with ASD, suggesting that they use a more diverse set of strategies for some social linguistic functions. These results, while showing promise for the development of automated language analysis tools to support targeted language interventions for ASD, also reveal weaknesses in the ability of large contextualized language models to model neuroatypical language.
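As a rough illustration of the kind of pipeline the abstract describes, the Python sketch below fine-tunes a generic pretrained transformer (bert-base-uncased, via the Hugging Face transformers library) to classify utterances by social-language function. The model choice, label set, example data, and hyperparameters are assumptions made here for illustration; they are not the paper's actual configuration or annotation scheme.

# Illustrative sketch only: fine-tune a pretrained transformer to label
# utterances with a (hypothetical) social-language function. The labels
# and data below are toy stand-ins, not the paper's annotation scheme.
import torch
from torch.utils.data import DataLoader
from transformers import AutoTokenizer, AutoModelForSequenceClassification

LABELS = ["directive", "acknowledgement", "question", "other"]  # hypothetical

# Toy stand-ins for transcribed utterances from a collaborative task.
utterances = [
    ("put the red block on the left", 0),
    ("okay, got it", 1),
    ("which piece goes next?", 2),
    ("hmm", 3),
]

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=len(LABELS)
)

def collate(batch):
    # Tokenize a batch of (text, label) pairs into model-ready tensors.
    texts, labels = zip(*batch)
    enc = tokenizer(list(texts), padding=True, truncation=True,
                    return_tensors="pt")
    enc["labels"] = torch.tensor(labels)
    return enc

loader = DataLoader(utterances, batch_size=2, shuffle=True, collate_fn=collate)
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

model.train()
for _ in range(3):                    # a few epochs on the toy data
    for batch in loader:
        optimizer.zero_grad()
        loss = model(**batch).loss    # cross-entropy from the "labels" key
        loss.backward()
        optimizer.step()

Evaluating such a classifier separately on utterances from speakers with and without ASD is the kind of per-group breakdown that would expose the accuracy gap the abstract reports.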
Anthology ID: 2022.coling-1.301
Volume: Proceedings of the 29th International Conference on Computational Linguistics
Month: October
Year: 2022
Address: Gyeongju, Republic of Korea
Editors: Nicoletta Calzolari, Chu-Ren Huang, Hansaem Kim, James Pustejovsky, Leo Wanner, Key-Sun Choi, Pum-Mo Ryu, Hsin-Hsi Chen, Lucia Donatelli, Heng Ji, Sadao Kurohashi, Patrizia Paggio, Nianwen Xue, Seokhwan Kim, Younggyun Hahm, Zhong He, Tony Kyungil Lee, Enrico Santus, Francis Bond, Seung-Hoon Na
Venue: COLING
Publisher: International Committee on Computational Linguistics
Pages: 3412–3419
URL: https://aclanthology.org/2022.coling-1.301
Cite (ACL): Duanchen Liu, Zoey Liu, Qingyun Yang, Yujing Huang, and Emily Prud’hommeaux. 2022. Evaluating the Performance of Transformer-based Language Models for Neuroatypical Language. In Proceedings of the 29th International Conference on Computational Linguistics, pages 3412–3419, Gyeongju, Republic of Korea. International Committee on Computational Linguistics.
Cite (Informal): Evaluating the Performance of Transformer-based Language Models for Neuroatypical Language (Liu et al., COLING 2022)
PDF: https://aclanthology.org/2022.coling-1.301.pdf