Scaling in Cognitive Modelling: a Multilingual Approach to Human Reading Times

Andrea de Varda, Marco Marelli


Abstract
Neural language models are increasingly valued in computational psycholinguistics, due to their ability to provide conditional probability distributions over the lexicon that are predictive of human processing times. Given the vast array of available models, it is of both theoretical and methodological importance to assess what features of a model influence its psychometric quality. In this work we focus on parameter size, showing that larger Transformer-based language models generate probabilistic estimates that are less predictive of early eye-tracking measurements reflecting lexical access and early semantic integration. However, relatively bigger models show an advantage in capturing late eye-tracking measurements that reflect the full semantic and syntactic integration of a word into the current language context. Our results are supported by eye movement data in ten languages and consider four models, spanning from 564M to 4.5B parameters.
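The abstract turns on per-word probability estimates (surprisal) from Transformer language models being used as predictors of eye-tracking measures. Below is a minimal sketch of how such estimates can be extracted with the Hugging Face transformers library; the checkpoint name facebook/xglm-564M and the helper token_surprisals are illustrative assumptions consistent with the 564M–4.5B parameter range mentioned above, not the authors' released pipeline.

```python
# Minimal sketch (not the authors' code): per-token surprisal from a causal LM.
# The checkpoint "facebook/xglm-564M" is an assumption matching the smallest
# model size (564M parameters) mentioned in the abstract.
import math

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_NAME = "facebook/xglm-564M"  # hypothetical choice; any causal LM works
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForCausalLM.from_pretrained(MODEL_NAME)
model.eval()


def token_surprisals(sentence: str):
    """Return (token, surprisal in bits) for every token after the first."""
    enc = tokenizer(sentence, return_tensors="pt")
    ids = enc["input_ids"]
    with torch.no_grad():
        log_probs = torch.log_softmax(model(**enc).logits, dim=-1)
    pairs = []
    # The model's prediction for the token at position t is read from the
    # logits at position t-1.
    for t in range(1, ids.shape[1]):
        logp = log_probs[0, t - 1, ids[0, t]].item()
        pairs.append((tokenizer.decode(int(ids[0, t])), -logp / math.log(2)))
    return pairs


if __name__ == "__main__":
    # In a reading-time study, these values would be aggregated over the
    # subword pieces of each word and regressed against eye-tracking measures.
    for tok, surprisal in token_surprisals("The cat sat on the mat."):
        print(f"{tok!r}: {surprisal:.2f} bits")
```

In the paper's setting, word-level surprisal values obtained this way would enter regression models predicting early measures (e.g., first fixation duration) and late measures (e.g., total reading time) separately; the aggregation and modelling details here are assumptions for illustration.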
Anthology ID: 2023.acl-short.14
Volume: Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)
Month: July
Year: 2023
Address: Toronto, Canada
Editors: Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue: ACL
Publisher: Association for Computational Linguistics
Pages: 139–149
URL: https://aclanthology.org/2023.acl-short.14
DOI: 10.18653/v1/2023.acl-short.14
Cite (ACL): Andrea de Varda and Marco Marelli. 2023. Scaling in Cognitive Modelling: a Multilingual Approach to Human Reading Times. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pages 139–149, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal): Scaling in Cognitive Modelling: a Multilingual Approach to Human Reading Times (de Varda & Marelli, ACL 2023)
PDF: https://aclanthology.org/2023.acl-short.14.pdf
Video: https://aclanthology.org/2023.acl-short.14.mp4