Large Language Models for Psycholinguistic Plausibility Pretesting

Samuel Amouyal, Aya Meltzer-Asscher, Jonathan Berant


Abstract
In psycholinguistics, the creation of controlled materials is crucial to ensure that research outcomes are attributable solely to the intended manipulations and not to extraneous factors. To achieve this, psycholinguists typically pretest linguistic materials; a common pretest is to solicit plausibility judgments on specific sentences from human evaluators. In this work, we investigate whether Language Models (LMs) can be used to generate these plausibility judgments. We examine a wide range of LMs across multiple linguistic structures and evaluate whether their plausibility judgments correlate with human judgments. We find that GPT-4 plausibility judgments correlate highly with human judgments across all the structures we examine, whereas other LMs correlate well with humans only on commonly used syntactic structures. We then test whether this correlation implies that LMs can be used in place of humans for pretesting. We find that this works well when coarse-grained plausibility judgments are needed, but when fine-grained judgments are necessary, even GPT-4 does not provide satisfactory discriminative power.
Anthology ID:
2024.findings-eacl.12
Volume:
Findings of the Association for Computational Linguistics: EACL 2024
Month:
March
Year:
2024
Address:
St. Julian’s, Malta
Editors:
Yvette Graham, Matthew Purver
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
166–181
URL:
https://aclanthology.org/2024.findings-eacl.12
Cite (ACL):
Samuel Amouyal, Aya Meltzer-Asscher, and Jonathan Berant. 2024. Large Language Models for Psycholinguistic Plausibility Pretesting. In Findings of the Association for Computational Linguistics: EACL 2024, pages 166–181, St. Julian’s, Malta. Association for Computational Linguistics.
Cite (Informal):
Large Language Models for Psycholinguistic Plausibility Pretesting (Amouyal et al., Findings 2024)
PDF:
https://aclanthology.org/2024.findings-eacl.12.pdf
Software:
 2024.findings-eacl.12.software.zip
Note:
 2024.findings-eacl.12.note.zip