Linear Recency Bias During Training Improves Transformers’ Fit to Reading Times

Christian Clark, Byung-Doh Oh, William Schuler


Abstract
Recent psycholinguistic research has compared human reading times to surprisal estimates from language models to study the factors shaping human sentence processing difficulty. Previous studies have shown a strong fit between surprisal values from Transformers and reading times. However, standard Transformers work with a lossless representation of the entire previous linguistic context, unlike models of human language processing that include memory decay. To bridge this gap, this paper evaluates a modification of the Transformer model that uses ALiBi (Press et al., 2022), a recency bias added to attention scores. Surprisal estimates from a Transformer that includes ALiBi during training and inference show an improved fit to human reading times compared to a standard Transformer baseline. A subsequent analysis of attention heads suggests that ALiBi’s mixture of slopes—which determine the rate of memory decay in each attention head—may play a role in the improvement by helping models with ALiBi to track different kinds of linguistic dependencies.
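The abstract describes ALiBi as a linear recency bias added to attention scores, with a per-head slope that sets each head's rate of memory decay. As a rough illustration only, not the authors' released code, the PyTorch sketch below builds the ALiBi bias matrix for a causal language model, assuming the power-of-two head count and geometric slope schedule from Press et al. (2022); the function names (e.g., `alibi_bias`) are hypothetical.

```python
import torch


def alibi_slopes(num_heads: int) -> torch.Tensor:
    # Geometric slope schedule from Press et al. (2022) for a power-of-two
    # number of heads: 2^(-8/n), 2^(-16/n), ..., 2^(-8).
    start = 2.0 ** (-8.0 / num_heads)
    return torch.tensor([start ** (i + 1) for i in range(num_heads)])


def alibi_bias(num_heads: int, seq_len: int) -> torch.Tensor:
    # Signed distance of each key position j from each query position i (j - i),
    # so past tokens receive increasingly negative values.
    positions = torch.arange(seq_len)
    distances = positions[None, :] - positions[:, None]          # (seq_len, seq_len)

    slopes = alibi_slopes(num_heads)                              # (num_heads,)
    # Linear penalty: each head scales the distance by its own slope.
    bias = slopes[:, None, None] * distances[None, :, :].float()  # (H, L, L)

    # Mask future positions for left-to-right language modeling.
    causal_mask = torch.triu(torch.ones(seq_len, seq_len, dtype=torch.bool), diagonal=1)
    return bias.masked_fill(causal_mask, float("-inf"))


# The bias is added to the pre-softmax attention scores, e.g.:
# scores = (q @ k.transpose(-2, -1)) / head_dim ** 0.5 + alibi_bias(num_heads, seq_len)
```

Because each head has its own slope, heads with small slopes retain more of the distant context while heads with large slopes decay quickly; this mixture of decay rates is what the abstract links to tracking different kinds of linguistic dependencies.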
Anthology ID:
2025.coling-main.517
Volume:
Proceedings of the 31st International Conference on Computational Linguistics
Month:
January
Year:
2025
Address:
Abu Dhabi, UAE
Editors:
Owen Rambow, Leo Wanner, Marianna Apidianaki, Hend Al-Khalifa, Barbara Di Eugenio, Steven Schockaert
Venue:
COLING
Publisher:
Association for Computational Linguistics
Pages:
7735–7747
URL:
https://aclanthology.org/2025.coling-main.517/
Cite (ACL):
Christian Clark, Byung-Doh Oh, and William Schuler. 2025. Linear Recency Bias During Training Improves Transformers’ Fit to Reading Times. In Proceedings of the 31st International Conference on Computational Linguistics, pages 7735–7747, Abu Dhabi, UAE. Association for Computational Linguistics.
Cite (Informal):
Linear Recency Bias During Training Improves Transformers’ Fit to Reading Times (Clark et al., COLING 2025)
PDF:
https://aclanthology.org/2025.coling-main.517.pdf