Do Transformer Models Show Similar Attention Patterns to Task-Specific Human Gaze?

Oliver Eberle, Stephanie Brandl, Jonas Pilot, Anders Søgaard


Abstract
Learned self-attention functions in state-of-the-art NLP models often correlate with human attention. We investigate whether self-attention in large-scale pre-trained language models is as predictive of human eye fixation patterns during task-reading as classical cognitive models of human attention. We compare attention functions across two task-specific reading datasets for sentiment analysis and relation extraction. We find the predictiveness of large-scale pre-trained self-attention for human attention depends on ‘what is in the tail’, e.g., the syntactic nature of rare contexts. Further, we observe that task-specific fine-tuning does not increase the correlation with human task-specific reading. Through an input reduction experiment we give complementary insights on the sparsity and fidelity trade-off, showing that lower-entropy attention vectors are more faithful.
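For illustration only, the following Python sketch (not the authors' exact pipeline; all function names and values are hypothetical toy examples) shows the kind of analysis the abstract describes: a Spearman rank correlation between per-token model attention and human fixation durations, together with the Shannon entropy of the attention vector, since the paper relates lower-entropy attention to higher faithfulness.

# Minimal sketch: relate token-level attention to human gaze and
# measure how concentrated (low-entropy) the attention distribution is.
# All arrays below are hand-made values for a 5-token sentence.
import numpy as np
from scipy.stats import spearmanr

def attention_entropy(attn):
    """Shannon entropy of a normalized attention vector."""
    p = attn / attn.sum()
    return float(-(p * np.log(p + 1e-12)).sum())

def gaze_correlation(attn, fixations):
    """Spearman rank correlation between attention and fixation durations."""
    rho, _ = spearmanr(attn, fixations)
    return float(rho)

attn = np.array([0.05, 0.40, 0.10, 0.35, 0.10])     # model attention per token
fix = np.array([120.0, 310.0, 90.0, 280.0, 100.0])  # total fixation time in ms

print(f"Spearman rho: {gaze_correlation(attn, fix):.2f}")
print(f"Attention entropy: {attention_entropy(attn):.2f}")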
Anthology ID:
2022.acl-long.296
Volume:
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
4295–4309
URL:
https://aclanthology.org/2022.acl-long.296
DOI:
10.18653/v1/2022.acl-long.296
Cite (ACL):
Oliver Eberle, Stephanie Brandl, Jonas Pilot, and Anders Søgaard. 2022. Do Transformer Models Show Similar Attention Patterns to Task-Specific Human Gaze?. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 4295–4309, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Do Transformer Models Show Similar Attention Patterns to Task-Specific Human Gaze? (Eberle et al., ACL 2022)
PDF:
https://aclanthology.org/2022.acl-long.296.pdf
Video:
https://aclanthology.org/2022.acl-long.296.mp4
Code:
oeberle/task_gaze_transformers
Data:
SST