Comparing Eye-gaze and Transformer Attention Mechanisms in Reading Tasks

Maria Mouratidi, Massimo Poesio


Abstract
As transformers become increasingly prevalent in NLP research, evaluating their cognitive alignment with human language processing has become essential for validating them as models of human language. This study compares eye-gaze patterns in human reading with transformer attention across three attention representations: raw attention, attention flow, and gradient-based saliency. We employ both statistical correlation analysis and predictive modeling, using PCA-reduced representations of eye-tracking features, across two reading tasks. The findings reveal lower correlations and weaker predictive capacity for the decoder model than for the encoder model, with implications for the gap between the behavioral performance and the cognitive plausibility of different transformer designs.
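
To make the raw-attention side of the comparison concrete, below is a minimal sketch (not the authors' released code) of extracting per-token attention from a pretrained encoder and correlating it with fixation durations. The model name, sentence, and gaze values are illustrative placeholders; the paper additionally examines attention flow and gradient-based saliency, and real analyses align word-level eye-tracking measures from reading corpora to model tokens.

# A minimal sketch, assuming a standard HuggingFace encoder with
# output_attentions; placeholder inputs, not the paper's data.
import numpy as np
import torch
from scipy.stats import spearmanr
from transformers import AutoModel, AutoTokenizer

model_name = "bert-base-uncased"  # assumption: any encoder would do here
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name, output_attentions=True)
model.eval()

sentence = "The researchers compared attention weights with eye movements"
inputs = tokenizer(sentence, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# outputs.attentions is a tuple with one (batch, heads, seq, seq) tensor
# per layer. "Raw attention" for a token here = the attention it receives,
# averaged over layers, heads, and query positions.
attn = torch.stack(outputs.attentions).squeeze(1)  # (layers, heads, seq, seq)
received = attn.mean(dim=(0, 1)).mean(dim=0)       # -> (seq,)

# Placeholder gaze vector: one fixation duration (ms) per subword token.
# A real analysis would map word-level gaze measures onto subword tokens
# and drop special tokens such as [CLS]/[SEP].
rng = np.random.default_rng(0)
gaze_ms = rng.uniform(150.0, 400.0, size=received.shape[0])

rho, p = spearmanr(received.numpy(), gaze_ms)
print(f"Spearman rho = {rho:.3f}, p = {p:.3f}")

The same per-token vector could be swapped for attention-flow or gradient-saliency scores before the correlation step, which is how the three representations remain comparable.
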
Anthology ID:
2025.gaze4nlp-1.4
Volume:
Proceedings of the First International Workshop on Gaze Data and Natural Language Processing
Month:
September
Year:
2025
Address:
Varna, Bulgaria
Editors:
Cengiz Acartürk, Jamal Nasir, Burcu Can, Çağrı Çöltekin
Venues:
Gaze4NLP | WS
Publisher:
INCOMA Ltd., Shoumen, BULGARIA
Pages:
26–36
URL:
https://aclanthology.org/2025.gaze4nlp-1.4/
Cite (ACL):
Maria Mouratidi and Massimo Poesio. 2025. Comparing Eye-gaze and Transformer Attention Mechanisms in Reading Tasks. In Proceedings of the First International Workshop on Gaze Data and Natural Language Processing, pages 26–36, Varna, Bulgaria. INCOMA Ltd., Shoumen, BULGARIA.
Cite (Informal):
Comparing Eye-gaze and Transformer Attention Mechanisms in Reading Tasks (Mouratidi & Poesio, Gaze4NLP 2025)
PDF:
https://aclanthology.org/2025.gaze4nlp-1.4.pdf