All Bark and No Bite: Rogue Dimensions in Transformer Language Models Obscure Representational Quality

William Timkey, Marten van Schijndel


Abstract
Similarity measures are a vital tool for understanding how language models represent and process language. Standard representational similarity measures such as cosine similarity and Euclidean distance have been successfully used in static word embedding models to understand how words cluster in semantic space. Recently, these measures have been applied to embeddings from contextualized models such as BERT and GPT-2. In this work, we call into question the informativity of such measures for contextualized language models. We find that a small number of rogue dimensions, often just 1-3, dominate these measures. Moreover, we find a striking mismatch between the dimensions that dominate similarity measures and those which are important to the behavior of the model. We show that simple postprocessing techniques such as standardization are able to correct for rogue dimensions and reveal underlying representational quality. We argue that accounting for rogue dimensions is essential for any similarity-based analysis of contextual language models.
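The abstract's central claim, that a few high-magnitude dimensions can dominate cosine similarity and that standardization corrects for this, can be illustrated with a toy example. The sketch below is not the authors' released code (see wtimkey/rogue-dimensions); it uses synthetic embeddings with one artificially inflated dimension and measures each dimension's average share of the cosine similarity, before and after z-scoring each dimension.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "contextual embeddings": 1000 tokens, 768 dims, with one
# artificially inflated rogue dimension (index 0).
X = rng.normal(size=(1000, 768))
X[:, 0] += 40.0  # rogue dimension: huge mean magnitude

def mean_dim_contribution(X):
    """Average absolute per-dimension contribution to cosine similarity
    over random token pairs. Each pair's cosine similarity is the dot
    product of unit vectors, which decomposes into per-dimension terms."""
    Xn = X / np.linalg.norm(X, axis=1, keepdims=True)
    pairs = rng.integers(0, len(X), size=(5000, 2))
    terms = Xn[pairs[:, 0]] * Xn[pairs[:, 1]]  # per-dimension products
    return np.abs(terms).mean(axis=0)

before = mean_dim_contribution(X)
print("rogue dim's share before:", before[0] / before.sum())  # dominates

# Standardization (z-score each dimension) removes the dominance.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
after = mean_dim_contribution(Z)
print("rogue dim's share after:", after[0] / after.sum())  # ~1/768
```

In this synthetic setup the single rogue dimension accounts for the majority of the similarity measure before standardization and roughly its fair 1/768 share afterward, mirroring the paper's finding that 1-3 dimensions can dominate unprocessed representations.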
Anthology ID:
2021.emnlp-main.372
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
4527–4546
URL:
https://aclanthology.org/2021.emnlp-main.372
DOI:
10.18653/v1/2021.emnlp-main.372
Bibkey:
Cite (ACL):
William Timkey and Marten van Schijndel. 2021. All Bark and No Bite: Rogue Dimensions in Transformer Language Models Obscure Representational Quality. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 4527–4546, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
All Bark and No Bite: Rogue Dimensions in Transformer Language Models Obscure Representational Quality (Timkey & van Schijndel, EMNLP 2021)
PDF:
https://aclanthology.org/2021.emnlp-main.372.pdf
Video:
https://aclanthology.org/2021.emnlp-main.372.mp4
Code:
wtimkey/rogue-dimensions