Language Models Implement Simple Word2Vec-style Vector Arithmetic

Jack Merullo, Carsten Eickhoff, Ellie Pavlick


Abstract
A primary criticism towards language models (LMs) is their inscrutability. This paper presents evidence that, despite their size and complexity, LMs sometimes exploit a simple vector-arithmetic-style mechanism to solve some relational tasks using regularities encoded in the hidden space of the model (e.g., Poland:Warsaw::China:Beijing). We investigate a range of language model sizes (from 124M parameters to 176B parameters) in an in-context learning setting, and find that for a variety of tasks (involving capital cities, uppercasing, and past-tensing) a key part of the mechanism reduces to a simple additive update typically applied by the feedforward networks (FFNs). We further show that this mechanism is specific to tasks that require retrieval from pretraining memory, rather than retrieval from local context. Our results contribute to a growing body of work on the interpretability of LMs, and offer reason to be optimistic that, despite the massive and non-linear nature of the models, the strategies they ultimately use to solve tasks can sometimes reduce to familiar and even intuitive algorithms.
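The word2vec-style analogy the abstract refers to (Poland:Warsaw::China:Beijing) can be illustrated with a minimal sketch. The toy embedding table below is hypothetical and hand-constructed so the arithmetic works exactly; real word2vec vectors or LM hidden states only approximate this additive structure, which is the regularity the paper investigates.

```python
import numpy as np

# Hypothetical toy embeddings: each capital is its country's vector plus a
# shared "capital-of" offset, so b - a + c lands exactly on the answer.
emb = {
    "poland":  np.array([1.0, 0.0, 0.0]),
    "china":   np.array([0.0, 1.0, 0.0]),
    "warsaw":  np.array([1.0, 0.0, 1.0]),  # poland + offset [0, 0, 1]
    "beijing": np.array([0.0, 1.0, 1.0]),  # china  + offset [0, 0, 1]
}

def analogy(a, b, c):
    """Solve a : b :: c : ? by vector arithmetic (b - a + c),
    returning the nearest remaining word by cosine similarity."""
    query = emb[b] - emb[a] + emb[c]
    candidates = {w: v for w, v in emb.items() if w not in (a, b, c)}
    return max(
        candidates,
        key=lambda w: np.dot(candidates[w], query)
        / (np.linalg.norm(candidates[w]) * np.linalg.norm(query) + 1e-9),
    )

print(analogy("poland", "warsaw", "china"))  # -> beijing
```

In the paper's setting, the analogous additive update is applied not to static word embeddings but to the residual stream by the FFN layers during in-context retrieval.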
Anthology ID:
2024.naacl-long.281
Volume:
Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 1: Long Papers)
Month:
June
Year:
2024
Address:
Mexico City, Mexico
Editors:
Kevin Duh, Helena Gomez, Steven Bethard
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
5030–5047
URL:
https://aclanthology.org/2024.naacl-long.281
Cite (ACL):
Jack Merullo, Carsten Eickhoff, and Ellie Pavlick. 2024. Language Models Implement Simple Word2Vec-style Vector Arithmetic. In Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 1: Long Papers), pages 5030–5047, Mexico City, Mexico. Association for Computational Linguistics.
Cite (Informal):
Language Models Implement Simple Word2Vec-style Vector Arithmetic (Merullo et al., NAACL 2024)
PDF:
https://aclanthology.org/2024.naacl-long.281.pdf
Copyright:
2024.naacl-long.281.copyright.pdf