Wing-Yee Chow
2025
Extracting structure from an LLM - how to improve on surprisal-based models of Human Language Processing
Daphne P. Wang | Mehrnoosh Sadrzadeh | Miloš Stanojević | Wing-Yee Chow | Richard Breheny
Proceedings of the 31st International Conference on Computational Linguistics
Prediction and reanalysis are considered two key processes that underlie humans’ capacity to comprehend language in real time. Computational models capture these processes using Large Language Models (LLMs) and a statistical measure known as ‘surprisal’. Despite the successes of LLMs, surprisal-based models face challenges with sentences that require reanalysis because of pervasive temporary structural ambiguities, such as garden path sentences. We ask whether structural information can be extracted from LLMs and develop a model that integrates it with their learnt statistics. When applied to a dataset of garden path sentences, the model achieved a significantly higher correlation with human reading times than surprisal. It also provided a better prediction of the garden path effect and could distinguish between sentence types with different levels of difficulty.
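As background to the surprisal baseline this abstract argues against, the sketch below shows how per-token surprisal, -log2 P(token | preceding context), is typically computed from a pretrained LLM. GPT-2 and the HuggingFace `transformers` library are illustrative assumptions, not the models or code used in the paper.

```python
# Minimal sketch of a surprisal baseline (assumed setup: GPT-2 via transformers).
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

def token_surprisals(sentence: str):
    """Return (token, surprisal-in-bits) pairs for every token after the first."""
    ids = tokenizer(sentence, return_tensors="pt")["input_ids"]
    with torch.no_grad():
        logits = model(ids).logits  # shape: (1, seq_len, vocab_size)
    # Log-probability of each actual next token given its left context.
    log_probs = torch.log_softmax(logits[0, :-1], dim=-1)
    next_ids = ids[0, 1:]
    token_logp = log_probs[torch.arange(next_ids.size(0)), next_ids]
    surprisal_bits = (-token_logp / torch.log(torch.tensor(2.0))).tolist()
    tokens = tokenizer.convert_ids_to_tokens(next_ids.tolist())
    return list(zip(tokens, surprisal_bits))

# The disambiguating region of a garden path sentence typically shows
# elevated surprisal, though (as the paper argues) not always enough
# to match human reading-time patterns.
print(token_surprisals("The horse raced past the barn fell."))
```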
2024
How can large language models become more human?
Daphne Wang | Mehrnoosh Sadrzadeh | Miloš Stanojević | Wing-Yee Chow | Richard Breheny
Proceedings of the Workshop on Cognitive Modeling and Computational Linguistics
Psycholinguistic experiments reveal that the efficiency of human language use is founded on predictions at both the syntactic and the lexical level. Previous models of human prediction exploiting LLMs have used an information-theoretic measure called surprisal, with success on naturalistic text in a wide variety of languages, but underperformance on challenging text such as garden path sentences. This paper introduces a novel framework that combines the lexical predictions of an LLM with the syntactic structures provided by a dependency parser. The framework gives rise to an Incompatibility Fraction. When tested on two garden path datasets, it correlated well with human reading times, distinguished between easy and hard garden path sentences, and outperformed surprisal.
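The abstract does not spell out how the Incompatibility Fraction is computed, so the sketch below only illustrates how the two ingredients it combines could be obtained: lexical predictions from an LLM and a dependency analysis from an off-the-shelf parser. GPT-2 and spaCy's `en_core_web_sm` are assumptions for illustration, not the tools named by the paper.

```python
# Hedged sketch: gathering the two inputs the framework combines.
# The actual Incompatibility Fraction computation is not reproduced here.
import torch
import spacy
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

nlp = spacy.load("en_core_web_sm")   # dependency parser (assumed choice)
tok = GPT2TokenizerFast.from_pretrained("gpt2")
lm = GPT2LMHeadModel.from_pretrained("gpt2")
lm.eval()

def top_lexical_predictions(prefix: str, k: int = 5):
    """Top-k next-token candidates and their probabilities given a sentence prefix."""
    ids = tok(prefix, return_tensors="pt")["input_ids"]
    with torch.no_grad():
        probs = torch.softmax(lm(ids).logits[0, -1], dim=-1)
    values, indices = probs.topk(k)
    return [(tok.decode(int(i)).strip(), float(p)) for i, p in zip(indices, values)]

def dependency_arcs(sentence: str):
    """(head, relation, dependent) arcs as produced by the parser."""
    return [(t.head.text, t.dep_, t.text) for t in nlp(sentence)]

# A framework of this kind would compare the structures licensed by the
# LM's predicted continuations against the structure forced by the actual
# continuation; only the raw inputs are shown in this sketch.
print(top_lexical_predictions("The old man the"))
print(dependency_arcs("The old man the boats."))
```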