Lossy Context Surprisal Predicts Task-Dependent Patterns in Relative Clause Processing

Kate McCurdy, Michael Hahn


Abstract
English relative clauses are a critical test case for theories of syntactic processing. Expectation- and memory-based accounts make opposing predictions, and behavioral experiments have found mixed results. We present a technical extension of Lossy Context Surprisal (LCS) and use it to model relative clause processing in three behavioral experiments. LCS predicts key results at distinct retention rates, showing that task-dependent memory demands can account for discrepant behavioral patterns in the literature.
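For orientation, LCS defines a word's processing difficulty as its expected surprisal under a lossy memory representation of the preceding context, with a retention rate controlling how much of that context survives. The sketch below is a minimal Monte Carlo illustration of that quantity under a simple word-erasure noise model; it is not the paper's technical extension, and the function names and the toy scoring function standing in for a real language model are illustrative assumptions.

    import math
    import random

    def lossy_context_surprisal(lm_logprob, context, word, retention,
                                n_samples=2000, seed=0):
        """Monte Carlo estimate of lossy context surprisal (LCS).

        Each context word survives independently with probability
        `retention` (the retention rate); the surprisal of `word` is
        averaged over the sampled lossy contexts. `lm_logprob` is any
        function mapping (context_words, word) -> log P(word | context).
        """
        rng = random.Random(seed)
        total = 0.0
        for _ in range(n_samples):
            lossy = [w for w in context if rng.random() < retention]
            total += -lm_logprob(lossy, word)
        return total / n_samples

    if __name__ == "__main__":
        # Toy stand-in for a real LM: the verb is more predictable when
        # the relativizer "that" survives in the lossy context.
        def toy_lm(ctx, word):
            return math.log(0.5) if "that" in ctx else math.log(0.1)

        context = "the reporter that the senator attacked".split()
        for retention in (0.3, 0.9):
            s = lossy_context_surprisal(toy_lm, context, "admitted", retention)
            print(f"retention={retention}: LCS ~ {s:.2f} nats")

Under this setup, a high retention rate approaches standard full-context surprisal, while a low retention rate produces memory-like forgetting effects; varying that single parameter across tasks is the mechanism the abstract describes.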
Anthology ID: 2024.conll-1.4
Volume: Proceedings of the 28th Conference on Computational Natural Language Learning
Month: November
Year: 2024
Address: Miami, FL, USA
Editors: Libby Barak, Malihe Alikhani
Venue: CoNLL
Publisher: Association for Computational Linguistics
Pages: 36–45
URL: https://aclanthology.org/2024.conll-1.4
Cite (ACL): Kate McCurdy and Michael Hahn. 2024. Lossy Context Surprisal Predicts Task-Dependent Patterns in Relative Clause Processing. In Proceedings of the 28th Conference on Computational Natural Language Learning, pages 36–45, Miami, FL, USA. Association for Computational Linguistics.
Cite (Informal): Lossy Context Surprisal Predicts Task-Dependent Patterns in Relative Clause Processing (McCurdy & Hahn, CoNLL 2024)
PDF: https://aclanthology.org/2024.conll-1.4.pdf