Cabbage Sweeter than Cake? Analysing the Potential of Large Language Models for Learning Conceptual Spaces

Usashi Chatterjee, Amit Gajbhiye, Steven Schockaert

Abstract
The theory of Conceptual Spaces is an influential cognitive-linguistic framework for representing the meaning of concepts. Conceptual spaces are constructed from a set of quality dimensions, which essentially correspond to primitive perceptual features (e.g. hue or size). These quality dimensions are usually learned from human judgements, which means that applications of conceptual spaces tend to be limited to narrow domains (e.g. modelling colour or taste). Encouraged by recent findings about the ability of Large Language Models (LLMs) to learn perceptually grounded representations, we explore the potential of such models for learning conceptual spaces. Our experiments show that LLMs can indeed be used for learning meaningful representations to some extent. However, we also find that fine-tuned models of the BERT family are able to match or even outperform the largest GPT-3 model, despite being 2 to 3 orders of magnitude smaller.
Anthology ID: 2023.emnlp-main.725
Volume: Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month: December
Year: 2023
Address: Singapore
Editors: Houda Bouamor, Juan Pino, Kalika Bali
Venue: EMNLP
Publisher: Association for Computational Linguistics
Pages: 11836–11842
URL: https://aclanthology.org/2023.emnlp-main.725
DOI: 10.18653/v1/2023.emnlp-main.725
Cite (ACL): Usashi Chatterjee, Amit Gajbhiye, and Steven Schockaert. 2023. Cabbage Sweeter than Cake? Analysing the Potential of Large Language Models for Learning Conceptual Spaces. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 11836–11842, Singapore. Association for Computational Linguistics.
Cite (Informal): Cabbage Sweeter than Cake? Analysing the Potential of Large Language Models for Learning Conceptual Spaces (Chatterjee et al., EMNLP 2023)
PDF: https://aclanthology.org/2023.emnlp-main.725.pdf
Video: https://aclanthology.org/2023.emnlp-main.725.mp4