High-dimensional vector spaces can accommodate constructional features quite conveniently

Jussi Karlgren


Abstract
Current language processing tools presuppose input in the form of a sequence of high-dimensional vectors with continuous values. Lexical items can be converted to such vectors with standard methodology, and subsequent processing is assumed to handle the structural features of the string. Constructional features typically do not fit into that processing pipeline: they are not as clearly sequential, they overlap with other items, and the fact that they are combinations of lexical items obscures their ontological status as observable linguistic items in their own right. Construction grammar frameworks allow a more general view of how lexical items and their configurations can be understood in a common framework. This paper introduces an approach to accommodating that understanding in a vector symbolic architecture, a processing framework which allows combinations of continuous vectors and discrete items, convenient for downstream processing with e.g. neural networks or other tools which expect input in vector form.
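The binding-and-bundling operations that vector symbolic architectures typically rely on can be sketched as follows. This is an illustrative assumption, not the paper's specific encoding: the role vectors (agent, recipient), the ditransitive marker, and the similarity threshold are all hypothetical names chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 10_000  # high dimensionality keeps random vectors nearly orthogonal

def random_vector():
    # A random bipolar (+1/-1) vector representing a lexical item or role.
    return rng.choice([-1.0, 1.0], size=DIM)

def bind(a, b):
    # Elementwise multiplication binds a filler to a role; the result
    # is dissimilar to both inputs, and binding again with the same
    # role vector recovers the filler (the operation is its own inverse).
    return a * b

def bundle(*vs):
    # Addition superposes items; the sum stays similar to each input.
    return np.sum(vs, axis=0)

def sim(a, b):
    # Cosine similarity.
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

# Hypothetical lexical and constructional vectors.
agent, recipient, ditransitive = (random_vector() for _ in range(3))
kim, pat = random_vector(), random_vector()

# Encode a ditransitive clause with Kim as agent and Pat as recipient:
# the constructional marker is bundled in alongside the bound role fillers,
# so the construction is an observable item in the same vector as the words.
utterance = bundle(bind(agent, kim), bind(recipient, pat), ditransitive)

# The constructional marker is recoverable from the utterance vector by
# similarity, and unbinding a role retrieves its filler.
print(sim(utterance, ditransitive) > 0.3)   # construction is detectable
print(sim(bind(utterance, agent), kim) > 0.3)  # Kim fills the agent role
```

Any two independently drawn vectors have similarity near zero at this dimensionality, so both the discrete constructional marker and the continuous lexical content survive superposition and can be queried from the single fixed-width vector.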
Anthology ID:
2023.cxgsnlp-1.4
Volume:
Proceedings of the First International Workshop on Construction Grammars and NLP (CxGs+NLP, GURT/SyntaxFest 2023)
Month:
March
Year:
2023
Address:
Washington, D.C.
Editors:
Claire Bonial, Harish Tayyar Madabushi
Venues:
CxGsNLP | SyntaxFest
SIG:
SIGPARSE
Publisher:
Association for Computational Linguistics
Pages:
31–35
URL:
https://aclanthology.org/2023.cxgsnlp-1.4
Cite (ACL):
Jussi Karlgren. 2023. High-dimensional vector spaces can accommodate constructional features quite conveniently. In Proceedings of the First International Workshop on Construction Grammars and NLP (CxGs+NLP, GURT/SyntaxFest 2023), pages 31–35, Washington, D.C. Association for Computational Linguistics.
Cite (Informal):
High-dimensional vector spaces can accommodate constructional features quite conveniently (Karlgren, CxGsNLP-SyntaxFest 2023)
PDF:
https://aclanthology.org/2023.cxgsnlp-1.4.pdf
Video:
https://aclanthology.org/2023.cxgsnlp-1.4.mov