A Language Model with Limited Memory Capacity Captures Interference in Human Sentence Processing

William Timkey, Tal Linzen


Abstract
Two of the central factors believed to underpin human sentence processing difficulty are expectations and retrieval from working memory. A recent attempt to create a unified cognitive model integrating these two factors has relied on the parallels between the self-attention mechanism of transformer language models and cue-based retrieval theories of working memory in human sentence processing (Ryu and Lewis, 2021). While the authors show that attention patterns in specialized attention heads of GPT-2 are consistent with similarity-based interference, a key prediction of cue-based retrieval models, their method requires identifying syntactically specialized attention heads and makes the cognitively implausible implicit assumption that hundreds of memory retrieval operations take place in parallel. In the present work, we develop a recurrent neural language model with a single self-attention head, which more closely parallels the memory system assumed by cognitive theories. We show that our model’s single attention head can capture the semantic and syntactic interference effects observed in human experiments.
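The architecture the abstract describes, a recurrent language model whose current state issues a single attention (retrieval) cue over past states, can be sketched minimally as follows. This is a hypothetical illustration of the general idea, not the authors' exact implementation; all names and the recurrent update are assumptions for the sketch.

```python
import numpy as np

def single_head_retrieval(query, memory):
    """One attention head over past hidden states: the current state acts
    as a retrieval cue and all past states compete as memory items, as in
    cue-based retrieval theories. Hypothetical sketch, not the paper's model."""
    d = query.shape[-1]
    scores = memory @ query / np.sqrt(d)   # cue-to-item similarity
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()               # softmax over memory items
    return weights @ memory, weights       # blended retrieved item + weights

# Toy recurrent loop: at each step, retrieve from all previous states,
# then fold the retrieved item into the next hidden state.
rng = np.random.default_rng(0)
d, steps = 8, 5
states = [rng.normal(size=d)]
for _ in range(steps - 1):
    retrieved, w = single_head_retrieval(states[-1], np.stack(states))
    states.append(np.tanh(states[-1] + retrieved))  # simple recurrent update
```

Because every retrieval is funneled through one head, similar items in memory necessarily compete for the same attention weights, which is the mechanism by which similarity-based interference can arise.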
Anthology ID:
2023.findings-emnlp.582
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2023
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
8705–8720
URL:
https://aclanthology.org/2023.findings-emnlp.582
DOI:
10.18653/v1/2023.findings-emnlp.582
Cite (ACL):
William Timkey and Tal Linzen. 2023. A Language Model with Limited Memory Capacity Captures Interference in Human Sentence Processing. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 8705–8720, Singapore. Association for Computational Linguistics.
Cite (Informal):
A Language Model with Limited Memory Capacity Captures Interference in Human Sentence Processing (Timkey & Linzen, Findings 2023)
PDF:
https://aclanthology.org/2023.findings-emnlp.582.pdf