Michael Wolfman


2024

Hierarchical syntactic structure in human-like language models
Michael Wolfman | Donald Dunagan | Jonathan Brennan | John Hale
Proceedings of the Workshop on Cognitive Modeling and Computational Linguistics

Language models (LMs) are a meeting point for cognitive modeling and computational linguistics. How should they be designed to serve as adequate cognitive models? To address this question, this study contrasts two Transformer-based LMs that share the same architecture. Only one of them analyzes sentences in terms of explicit hierarchical structure. Evaluating the two LMs against fMRI time series via the surprisal complexity metric implicates the superior temporal gyrus. These findings underline the need for hierarchical sentence structure in word-by-word models of human language comprehension.
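
The surprisal complexity metric mentioned above is the negative log probability of each word given its left context, surprisal(w_t) = -log2 P(w_t | w_1..w_{t-1}). The sketch below shows how per-token surprisal can be computed from a causal LM; the model name ("gpt2"), the function name, and the example sentence are illustrative assumptions, not the paper's actual models or fMRI analysis pipeline.

```python
# Illustrative sketch: per-token surprisal from a causal LM.
# Not the paper's actual models or regression pipeline.
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

def token_surprisals(text: str, model_name: str = "gpt2"):
    """Return (token, surprisal-in-bits) pairs for every token after the first."""
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForCausalLM.from_pretrained(model_name)
    model.eval()

    ids = tokenizer(text, return_tensors="pt").input_ids  # shape (1, T)
    with torch.no_grad():
        logits = model(ids).logits  # shape (1, T, V)

    # Logits at position t-1 give the distribution over token t,
    # so align logits[:-1] with targets ids[1:].
    log_probs = torch.log_softmax(logits[0, :-1], dim=-1)
    targets = ids[0, 1:]
    nll = -log_probs[torch.arange(targets.size(0)), targets]
    surprisal_bits = nll / torch.log(torch.tensor(2.0))  # nats -> bits

    tokens = tokenizer.convert_ids_to_tokens(targets.tolist())
    return list(zip(tokens, surprisal_bits.tolist()))

if __name__ == "__main__":
    for tok, s in token_surprisals("The cat sat on the mat."):
        print(f"{tok:>10s}  {s:6.2f} bits")
```

In a study like this one, per-word surprisals from each LM would then serve as regressors against the fMRI time series; that regression step is not shown here.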