Investigating Semantic Subspaces of Transformer Sentence Embeddings through Linear Structural Probing

Dmitry Nikolaev, Sebastian Padó


Abstract
The question of what kinds of linguistic information are encoded in different layers of Transformer-based language models is of considerable interest to the NLP community. Existing work, however, has overwhelmingly focused on word-level representations and encoder-only language models with the masked-token training objective. In this paper, we present experiments with semantic structural probing, a method for studying sentence-level representations by finding a subspace of the embedding space that provides suitable task-specific pairwise distances between data points. We apply our method to language models from different families (encoder-only, decoder-only, encoder-decoder) and of different sizes in the context of two tasks, semantic textual similarity and natural-language inference. We find that model families differ substantially in their performance and layer dynamics, but that the results are largely model-size invariant.
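To make the probing setup concrete, the sketch below shows one common way to realize a linear structural probe over sentence embeddings: learn a linear projection so that Euclidean distances in the projected subspace approximate gold task distances (e.g., distances derived from STS similarity scores). This is a minimal illustration under assumed names, dimensions, and loss; it is not the authors' exact implementation.

```python
# Minimal sketch of a linear structural probe for sentence embeddings.
# All hyperparameters (embedding_dim, probe_rank, lr, epochs) are assumptions
# for illustration, not values from the paper.
import torch


class LinearDistanceProbe(torch.nn.Module):
    """Projects sentence embeddings into a low-rank subspace and scores a
    sentence pair by the L2 distance between the projected embeddings."""

    def __init__(self, embedding_dim: int, probe_rank: int):
        super().__init__()
        # Linear map B from the model's embedding space to the probed subspace.
        self.proj = torch.nn.Linear(embedding_dim, probe_rank, bias=False)

    def forward(self, emb_a: torch.Tensor, emb_b: torch.Tensor) -> torch.Tensor:
        # Predicted distance = ||B a - B b||_2 for each pair in the batch.
        return torch.norm(self.proj(emb_a) - self.proj(emb_b), dim=-1)


def train_probe(emb_a, emb_b, gold_dist, embedding_dim=768, probe_rank=64,
                epochs=50, lr=1e-3):
    """Fit the projection so that probe distances match gold task distances
    (e.g., 1 minus a normalized STS similarity score for each sentence pair)."""
    probe = LinearDistanceProbe(embedding_dim, probe_rank)
    optimizer = torch.optim.Adam(probe.parameters(), lr=lr)
    for _ in range(epochs):
        optimizer.zero_grad()
        pred = probe(emb_a, emb_b)
        loss = torch.nn.functional.mse_loss(pred, gold_dist)
        loss.backward()
        optimizer.step()
    return probe
```

The probe is deliberately weak (a single linear map with no bias), so any structure it recovers must already be linearly present in the frozen sentence embeddings of the layer being probed.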
Anthology ID: 2023.blackboxnlp-1.11
Volume: Proceedings of the 6th BlackboxNLP Workshop: Analyzing and Interpreting Neural Networks for NLP
Month: December
Year: 2023
Address: Singapore
Editors: Yonatan Belinkov, Sophie Hao, Jaap Jumelet, Najoung Kim, Arya McCarthy, Hosein Mohebbi
Venues: BlackboxNLP | WS
Publisher: Association for Computational Linguistics
Pages: 142–154
URL: https://aclanthology.org/2023.blackboxnlp-1.11
DOI: 10.18653/v1/2023.blackboxnlp-1.11
Cite (ACL): Dmitry Nikolaev and Sebastian Padó. 2023. Investigating Semantic Subspaces of Transformer Sentence Embeddings through Linear Structural Probing. In Proceedings of the 6th BlackboxNLP Workshop: Analyzing and Interpreting Neural Networks for NLP, pages 142–154, Singapore. Association for Computational Linguistics.
Cite (Informal): Investigating Semantic Subspaces of Transformer Sentence Embeddings through Linear Structural Probing (Nikolaev & Padó, BlackboxNLP-WS 2023)
PDF: https://aclanthology.org/2023.blackboxnlp-1.11.pdf