Structural Priming Demonstrates Abstract Grammatical Representations in Multilingual Language Models

James Michaelov, Catherine Arnett, Tyler Chang, Ben Bergen


Abstract
Abstract grammatical knowledge (of parts of speech and grammatical patterns) is key to the capacity for linguistic generalization in humans. But how abstract is grammatical knowledge in large language models? In the human literature, compelling evidence for grammatical abstraction comes from structural priming: a sentence that has the same grammatical structure as a preceding sentence is processed and produced more readily. Because stimuli within a single language introduce confounds, evidence of abstraction is even more compelling when it comes from crosslingual structural priming, where use of a syntactic structure in one language primes an analogous structure in another language. We measure crosslingual structural priming in large language models, comparing model behavior to human experimental results from eight crosslingual experiments covering six languages, and four monolingual structural priming experiments in three non-English languages. We find evidence for abstract monolingual and crosslingual grammatical representations in the models that function similarly to those found in humans. These results demonstrate that grammatical representations in multilingual language models are not only similar across languages but can also causally influence text produced in other languages.
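This page does not spell out the measurement procedure, but structural priming in language models is typically quantified by comparing the probability a model assigns to a target sentence after a prime with a matching versus mismatching structure. The following is a minimal Python sketch of that general idea using Hugging Face transformers; the model name (facebook/xglm-564M), the toy dative-alternation sentences, and the scoring function are illustrative assumptions, not the authors' exact setup.

# Illustrative sketch only: tests whether a structure-matched prime raises
# the log-probability of a target sentence. Model choice and sentences are
# assumptions for illustration, not taken from the paper.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_NAME = "facebook/xglm-564M"  # an example multilingual causal LM
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForCausalLM.from_pretrained(MODEL_NAME)
model.eval()

def target_logprob(prime: str, target: str) -> float:
    # Tokenize prime and target separately so we know exactly which
    # positions in the concatenated sequence belong to the target.
    prime_ids = tokenizer(prime, return_tensors="pt").input_ids
    target_ids = tokenizer(" " + target, add_special_tokens=False,
                           return_tensors="pt").input_ids
    full_ids = torch.cat([prime_ids, target_ids], dim=1)
    with torch.no_grad():
        logits = model(full_ids).logits
    log_probs = torch.log_softmax(logits, dim=-1)
    # Logits at position i predict the token at position i + 1, so each
    # target token is scored from the position immediately before it.
    n_prime = prime_ids.shape[1]
    score = 0.0
    for i in range(target_ids.shape[1]):
        pos = n_prime + i
        score += log_probs[0, pos - 1, full_ids[0, pos]].item()
    return score

# Toy dative-alternation items: prepositional-object (PO) vs. double-object (DO).
po_prime = "The teacher gave a book to the student."
do_prime = "The teacher gave the student a book."
po_target = "The chef handed a plate to the waiter."

priming_effect = (target_logprob(po_prime, po_target)
                  - target_logprob(do_prime, po_target))
print(f"PO-prime advantage for the PO target (log-prob difference): {priming_effect:.3f}")

On this logic, a positive difference means the PO prime made the PO target more probable than the DO prime did, i.e., a structural priming effect; a crosslingual variant of the same design would pair a prime in one language with a target in another.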
Anthology ID: 2023.emnlp-main.227
Volume: Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month: December
Year: 2023
Address: Singapore
Editors: Houda Bouamor, Juan Pino, Kalika Bali
Venue: EMNLP
Publisher: Association for Computational Linguistics
Pages: 3703–3720
URL: https://aclanthology.org/2023.emnlp-main.227
DOI: 10.18653/v1/2023.emnlp-main.227
Cite (ACL): James Michaelov, Catherine Arnett, Tyler Chang, and Ben Bergen. 2023. Structural Priming Demonstrates Abstract Grammatical Representations in Multilingual Language Models. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 3703–3720, Singapore. Association for Computational Linguistics.
Cite (Informal): Structural Priming Demonstrates Abstract Grammatical Representations in Multilingual Language Models (Michaelov et al., EMNLP 2023)
PDF: https://aclanthology.org/2023.emnlp-main.227.pdf
Video: https://aclanthology.org/2023.emnlp-main.227.mp4