What Can String Probability Tell Us About Grammaticality?

Jennifer Hu, Ethan Gotlieb Wilcox, Siyuan Song, Kyle Mahowald, Roger P. Levy


Abstract
What have language models (LMs) learned about grammar? This question remains hotly debated, with major ramifications for linguistic theory. However, since probability and grammaticality are distinct notions in linguistics, it is not obvious what string probabilities can reveal about an LM’s underlying grammatical knowledge. We present a theoretical analysis of the relationship between grammar, meaning, and string probability, based on simple assumptions about the generative process of corpus data. Our framework makes three predictions, which we validate empirically using 280K sentence pairs in English and Chinese: (1) correlation between the probability of strings within minimal pairs, i.e., string pairs with minimal semantic differences; (2) correlation between models’ and humans’ deltas within minimal pairs; and (3) poor separation in probability space between unpaired grammatical and ungrammatical strings. Our analyses give theoretical grounding for using probability to learn about LMs’ structural knowledge, and suggest directions for future work in LM grammatical evaluation.
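The abstract's second prediction concerns the correlation between models' and humans' deltas within minimal pairs. As a minimal illustrative sketch of that comparison — using invented log-probabilities and acceptability scores, not data from the paper, and assuming a pair's delta is the log-probability difference between its two strings:

```python
import math

# Hypothetical LM log-probabilities for three minimal pairs,
# as (grammatical, ungrammatical) tuples. Values are invented.
lm_logprobs = [(-42.1, -47.3), (-35.6, -36.9), (-51.0, -58.2)]

# Hypothetical human acceptability deltas for the same three pairs.
human_deltas = [4.8, 1.1, 6.5]

# Within-pair delta: log P(grammatical) - log P(ungrammatical).
lm_deltas = [g - u for g, u in lm_logprobs]

def pearson(xs, ys):
    """Pearson correlation between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

r = pearson(lm_deltas, human_deltas)
```

In practice the per-string log-probabilities would come from scoring each string with an LM; here they are placeholders so the delta-and-correlation logic stands on its own.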
Anthology ID: 2026.tacl-1.7
Volume: Transactions of the Association for Computational Linguistics, Volume 14
Year: 2026
Address: Cambridge, MA
Venue: TACL
Publisher: MIT Press
Pages: 124–146
URL: https://aclanthology.org/2026.tacl-1.7/
DOI: 10.1162/tacl.a.611
Cite (ACL): Jennifer Hu, Ethan Gotlieb Wilcox, Siyuan Song, Kyle Mahowald, and Roger P. Levy. 2026. What Can String Probability Tell Us About Grammaticality?. Transactions of the Association for Computational Linguistics, 14:124–146.
Cite (Informal): What Can String Probability Tell Us About Grammaticality? (Hu et al., TACL 2026)
PDF: https://aclanthology.org/2026.tacl-1.7.pdf