Construction Grammar Provides Unique Insight into Neural Language Models

Leonie Weissweiler, Taiqi He, Naoki Otani, David R. Mortensen, Lori Levin, Hinrich Schütze


Abstract
Construction Grammar (CxG) has recently been used as the basis for probing studies that have investigated the performance of large pretrained language models (PLMs) with respect to the structure and meaning of constructions. In this position paper, we make suggestions for the continuation and augmentation of this line of research. We look at probing methodology that was not designed with CxG in mind, as well as probing methodology that was designed for specific constructions. We analyse selected previous work in detail, and provide our view of the most important challenges and research questions that this promising new field faces.
Anthology ID:
2023.cxgsnlp-1.10
Volume:
Proceedings of the First International Workshop on Construction Grammars and NLP (CxGs+NLP, GURT/SyntaxFest 2023)
Month:
March
Year:
2023
Address:
Washington, D.C.
Editors:
Claire Bonial, Harish Tayyar Madabushi
Venues:
CxGsNLP | SyntaxFest
SIG:
SIGPARSE
Publisher:
Association for Computational Linguistics
Pages:
85–95
URL:
https://aclanthology.org/2023.cxgsnlp-1.10
Cite (ACL):
Leonie Weissweiler, Taiqi He, Naoki Otani, David R. Mortensen, Lori Levin, and Hinrich Schütze. 2023. Construction Grammar Provides Unique Insight into Neural Language Models. In Proceedings of the First International Workshop on Construction Grammars and NLP (CxGs+NLP, GURT/SyntaxFest 2023), pages 85–95, Washington, D.C. Association for Computational Linguistics.
Cite (Informal):
Construction Grammar Provides Unique Insight into Neural Language Models (Weissweiler et al., CxGsNLP-SyntaxFest 2023)
PDF:
https://aclanthology.org/2023.cxgsnlp-1.10.pdf
Video:
https://aclanthology.org/2023.cxgsnlp-1.10.mov