A Neural Model of Adaptation in Reading

Marten van Schijndel, Tal Linzen


Abstract
It has been argued that humans rapidly adapt their lexical and syntactic expectations to match the statistics of the current linguistic context. We provide further support for this claim by showing that adding a simple adaptation mechanism to a neural language model improves predictions of human reading times compared to a non-adaptive model. We analyze the performance of the model on controlled materials from psycholinguistic experiments and show that it adapts not only to lexical items but also to abstract syntactic structures.
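The core idea — a language model that updates its statistics on the material it has just processed, so that repeated words and structures become less surprising — can be illustrated with a deliberately simplified sketch. The paper adapts a neural (LSTM) language model by continued training on the preceding context; the toy unigram model below is a hypothetical stand-in used only to make the adaptation-and-surprisal loop concrete, and none of its names come from the paper's released code.

```python
import math
from collections import Counter

class AdaptiveUnigramLM:
    """Toy unigram LM that adapts to the running context.

    Illustration only: the actual model in the paper is a neural
    (LSTM) language model adapted by continued gradient training,
    not a count-based unigram model.
    """

    def __init__(self, vocab, smoothing=1.0):
        # Add-one-style smoothing so unseen words get nonzero probability.
        self.counts = Counter({w: smoothing for w in vocab})
        self.total = smoothing * len(vocab)

    def surprisal(self, word):
        # Surprisal in bits: -log2 P(word).
        return -math.log2(self.counts[word] / self.total)

    def adapt(self, word):
        # Fold the observed word into the model's statistics, so
        # later occurrences of the same word are less surprising.
        self.counts[word] += 1
        self.total += 1

vocab = ["the", "cat", "sat", "on", "mat"]
lm = AdaptiveUnigramLM(vocab)

before = lm.surprisal("cat")
lm.adapt("cat")
after = lm.surprisal("cat")
assert after < before  # adaptation lowers surprisal for repeated material
```

In the paper's setting the same loop runs over sentences rather than single words, and `adapt` corresponds to a gradient update of the neural model on the sentence just read; the resulting per-word surprisals are then compared against human reading times.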
Anthology ID: D18-1499
Volume: Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
Month: October-November
Year: 2018
Address: Brussels, Belgium
Venue: EMNLP
SIG: SIGDAT
Publisher: Association for Computational Linguistics
Pages: 4704–4710
URL: https://aclanthology.org/D18-1499
DOI: 10.18653/v1/D18-1499
Cite (ACL):
Marten van Schijndel and Tal Linzen. 2018. A Neural Model of Adaptation in Reading. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, pages 4704–4710, Brussels, Belgium. Association for Computational Linguistics.
Cite (Informal):
A Neural Model of Adaptation in Reading (van Schijndel & Linzen, EMNLP 2018)
PDF: https://aclanthology.org/D18-1499.pdf
Attachment: D18-1499.Attachment.zip
Video: https://vimeo.com/306153668
Code: vansky/neural-complexity
Data: Natural Stories