Reverse-Engineering the Reader

Samuel Kiegeland, Ethan Wilcox, Afra Amini, David Reich, Ryan Cotterell


Abstract
Numerous previous studies have sought to determine to what extent language models, pretrained on natural language text, can serve as useful models of human cognition. In this paper, we are interested in the opposite question: whether we can directly optimize a language model to be a useful cognitive model by aligning it to human psychometric data. To achieve this, we introduce a novel alignment technique in which we fine-tune a language model to implicitly optimize the parameters of a linear regressor that directly predicts humans’ reading times of in-context linguistic units, e.g., phonemes, morphemes, or words, using surprisal estimates derived from the language model. Using words as a test case, we evaluate our technique across multiple model sizes and datasets and find that it improves language models’ psychometric predictive power. However, we find an inverse relationship between psychometric power and a model’s performance on downstream NLP tasks as well as its perplexity on held-out test data. While this latter trend has been observed before (Oh et al., 2022; Shain et al., 2024), we are the first to induce it by manipulating a model’s alignment to psychometric data.
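The core component of the abstract's setup is a linear regressor that predicts per-word reading times from a language model's surprisal estimates. The following minimal sketch (not the authors' implementation; the data values are hypothetical placeholders) shows such a surprisal-to-reading-time regression fit by ordinary least squares:

```python
import numpy as np

# Hypothetical data: surprisal (in bits) for each word, and observed
# per-word reading times (in ms), e.g., from an eye-tracking corpus.
surprisal = np.array([2.1, 5.3, 1.7, 8.2, 4.4, 6.9])
reading_time = np.array([180.0, 245.0, 160.0, 310.0, 225.0, 280.0])

# Fit reading_time ≈ a * surprisal + b by ordinary least squares.
X = np.column_stack([surprisal, np.ones_like(surprisal)])
coef, _, _, _ = np.linalg.lstsq(X, reading_time, rcond=None)
a, b = coef

# The fitted regressor's predictive fit on held-out reading times is one
# way to quantify a model's psychometric predictive power.
predictions = X @ coef
mse = np.mean((predictions - reading_time) ** 2)
print(f"slope={a:.2f} ms/bit, intercept={b:.2f} ms, MSE={mse:.2f}")
```

In the paper's framing, the language model itself is fine-tuned so that the surprisal values it produces make this downstream regression fit the human data better, rather than fitting the regressor to a frozen model's surprisals.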
Anthology ID:
2024.emnlp-main.526
Volume:
Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2024
Address:
Miami, Florida, USA
Editors:
Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
9367–9389
URL:
https://aclanthology.org/2024.emnlp-main.526
Cite (ACL):
Samuel Kiegeland, Ethan Wilcox, Afra Amini, David Reich, and Ryan Cotterell. 2024. Reverse-Engineering the Reader. In Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing, pages 9367–9389, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal):
Reverse-Engineering the Reader (Kiegeland et al., EMNLP 2024)
PDF:
https://aclanthology.org/2024.emnlp-main.526.pdf