Byung-Doh Oh and William Schuler. 2023. Transformer-Based Language Model Surprisal Predicts Human Reading Times Best with About Two Billion Training Tokens. In Houda Bouamor, Juan Pino, and Kalika Bali, editors, Findings of the Association for Computational Linguistics: EMNLP 2023, pages 1915-1921, Singapore. Association for Computational Linguistics. Anthology ID: oh-schuler-2023-transformer. DOI: 10.18653/v1/2023.findings-emnlp.128. URL: https://aclanthology.org/2023.findings-emnlp.128/