Modeling Construction Grammar’s Way into NLP: Insights from negative results in automatically identifying schematic clausal constructions in Brazilian Portuguese

Arthur Lorenzi, Vânia Gomes de Almeida, Ely Edison Matos, Tiago Timponi Torrent


Abstract
This paper reports on negative results in a task of automatic identification of schematic clausal constructions and their elements in Brazilian Portuguese. The experiment was set up to test whether form and meaning properties of constructions, modeled in terms of Universal Dependencies and FrameNet Frames in a Constructicon, would improve the performance of transformer models in the task. Qualitative analysis of the results indicates that alternatives to the linearization of those properties, dataset size and a post-processing module should be explored in the future as a means of making use of information in Constructicons for NLP tasks.
Anthology ID:
2023.cxgsnlp-1.11
Original:
2023.cxgsnlp-1.11v1
Version 2:
2023.cxgsnlp-1.11v2
Volume:
Proceedings of the First International Workshop on Construction Grammars and NLP (CxGs+NLP, GURT/SyntaxFest 2023)
Month:
March
Year:
2023
Address:
Washington, D.C.
Editors:
Claire Bonial, Harish Tayyar Madabushi
Venues:
CxGsNLP | SyntaxFest
SIG:
SIGPARSE
Publisher:
Association for Computational Linguistics
Pages:
96–109
URL:
https://aclanthology.org/2023.cxgsnlp-1.11
Cite (ACL):
Arthur Lorenzi, Vânia Gomes de Almeida, Ely Edison Matos, and Tiago Timponi Torrent. 2023. Modeling Construction Grammar’s Way into NLP: Insights from negative results in automatically identifying schematic clausal constructions in Brazilian Portuguese. In Proceedings of the First International Workshop on Construction Grammars and NLP (CxGs+NLP, GURT/SyntaxFest 2023), pages 96–109, Washington, D.C. Association for Computational Linguistics.
Cite (Informal):
Modeling Construction Grammar’s Way into NLP: Insights from negative results in automatically identifying schematic clausal constructions in Brazilian Portuguese (Lorenzi et al., CxGsNLP-SyntaxFest 2023)
PDF:
https://aclanthology.org/2023.cxgsnlp-1.11.pdf
Video:
https://aclanthology.org/2023.cxgsnlp-1.11.mov