Discourse Sensitivity in Attraction Effects: The Interplay Between Language Model Size and Training Data

Sanghee Kim, Forrest Davis


Anthology ID: 2025.scil-1.24
Volume: Proceedings of the Society for Computation in Linguistics 2025
Month: July
Year: 2025
Address: Eugene, Oregon
Editors: Carolyn Jane Anderson, Frédéric Mailhot, Grusha Prasad
Venue: SCiL
Publisher: Association for Computational Linguistics
Pages: 287–299
URL: https://aclanthology.org/2025.scil-1.24/
Cite (ACL): Sanghee Kim and Forrest Davis. 2025. Discourse Sensitivity in Attraction Effects: The Interplay Between Language Model Size and Training Data. In Proceedings of the Society for Computation in Linguistics 2025, pages 287–299, Eugene, Oregon. Association for Computational Linguistics.
Cite (Informal): Discourse Sensitivity in Attraction Effects: The Interplay Between Language Model Size and Training Data (Kim & Davis, SCiL 2025)
PDF: https://aclanthology.org/2025.scil-1.24.pdf