From Word Types to Tokens and Back: A Survey of Approaches to Word Meaning Representation and Interpretation

Marianna Apidianaki


Abstract
Vector-based word representation paradigms situate lexical meaning at different levels of abstraction. Distributional and static embedding models generate a single vector per word type, an aggregate over the word's instances in a corpus. Contextual language models, by contrast, directly capture the meaning of individual word instances. The goal of this survey is to provide an overview of word meaning representation methods, and of the strategies that have been proposed for improving the quality of the generated vectors. These often involve injecting external knowledge about lexical semantic relationships, or refining the vectors to describe different senses. The survey also covers recent approaches for obtaining word type-level representations from token-level ones, and for combining static and contextualized representations. Special focus is given to probing and interpretation studies aimed at discovering the lexical semantic knowledge that is encoded in contextualized representations. The challenges posed by this exploration have motivated interest in deriving static embeddings from contextualized ones, and in methods aimed at improving the similarity estimates that can be drawn from the space of contextual language models.
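To make the type-vs-token distinction concrete, below is a minimal sketch of one common strategy in the family the abstract mentions: deriving a static, type-level vector for a word by mean-pooling its contextualized token embeddings across occurrences. The model choice (bert-base-uncased), the subword-matching heuristic, and the pooling scheme are illustrative assumptions, not the survey's prescribed method.

```python
# Sketch: type-level vector from token-level contextualized embeddings.
# All modeling choices here (model, matching, mean-pooling) are assumptions.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased").eval()

def type_level_embedding(word: str, sentences: list[str]) -> torch.Tensor:
    """Mean-pool the contextualized vectors of `word` over its occurrences."""
    target = tokenizer(word, add_special_tokens=False)["input_ids"]
    vectors = []
    for sent in sentences:
        enc = tokenizer(sent, return_tensors="pt")
        ids = enc["input_ids"][0].tolist()
        with torch.no_grad():
            hidden = model(**enc).last_hidden_state[0]  # (seq_len, dim)
        # Naive contiguous match of the word's subword IDs; a real system
        # would align tokens to words more robustly.
        for i in range(len(ids) - len(target) + 1):
            if ids[i : i + len(target)] == target:
                # First pool over the word's subword pieces (token level)...
                vectors.append(hidden[i : i + len(target)].mean(dim=0))
    if not vectors:
        raise ValueError(f"'{word}' not found in the given sentences")
    # ...then aggregate over instances to obtain a single type-level vector.
    return torch.stack(vectors).mean(dim=0)

# Example: a type-level vector for "bank" pooled over two distinct senses.
vec = type_level_embedding("bank", ["She sat on the river bank.",
                                    "He deposited cash at the bank."])
```

Averaging across instances recovers type-level behavior from a contextual model, but it blurs sense distinctions, which is precisely the trade-off between aggregate and instance-level representations that the survey examines.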
Anthology ID: 2023.cl-2.7
Volume: Computational Linguistics, Volume 49, Issue 2 - June 2023
Month: June
Year: 2023
Address: Cambridge, MA
Venue: CL
Publisher: MIT Press
Pages: 465–523
URL: https://aclanthology.org/2023.cl-2.7
DOI: 10.1162/coli_a_00474
Cite (ACL): Marianna Apidianaki. 2023. From Word Types to Tokens and Back: A Survey of Approaches to Word Meaning Representation and Interpretation. Computational Linguistics, 49(2):465–523.
Cite (Informal): From Word Types to Tokens and Back: A Survey of Approaches to Word Meaning Representation and Interpretation (Apidianaki, CL 2023)
PDF: https://aclanthology.org/2023.cl-2.7.pdf