Statistical Depth for Ranking and Characterizing Transformer-Based Text Embeddings

Parker Seegmiller, Sarah Preum

Abstract
The popularity of transformer-based text embeddings calls for better statistical tools for measuring distributions of such embeddings. One such tool would be a method for ranking texts within a corpus by centrality, i.e., assigning each text a number signifying how representative that text is of the corpus as a whole. However, an intrinsic center-outward ordering of high-dimensional text representations is not trivial. A statistical depth is a function for ranking k-dimensional objects by measuring centrality with respect to some observed k-dimensional distribution. We adopt a statistical depth for measuring distributions of transformer-based text embeddings, which we call transformer-based text embedding (TTE) depth, and introduce the practical use of this depth for both modeling and distributional inference in NLP pipelines. We first define TTE depth and an associated rank sum test for determining whether two corpora differ significantly in embedding space. We then use TTE depth for the task of in-context learning prompt selection, showing that this approach reliably improves performance over statistical baselines across six text classification tasks. Finally, we use TTE depth and the associated rank sum test to characterize the distributions of synthesized and human-generated corpora, showing that five recent synthetic data augmentation processes cause a measurable distributional shift away from the associated human-generated text.
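
As a rough illustration of the workflow the abstract describes, the sketch below ranks texts by a simple centrality score (mean cosine similarity to a reference corpus of embeddings) and compares two corpora of scores with a Wilcoxon rank-sum test. The stand-in score, the encoder choice "all-MiniLM-L6-v2", the helper depth_scores, and the toy corpora are assumptions made for illustration only; they are not the paper's TTE depth definition.

import numpy as np
from scipy.stats import ranksums
from sentence_transformers import SentenceTransformer

def depth_scores(texts, reference_texts, model):
    """Stand-in depth score: mean cosine similarity to a reference corpus.
    (Illustrative only; not the TTE depth defined in the paper.)"""
    ref = model.encode(reference_texts, normalize_embeddings=True)  # (m, k) unit vectors
    emb = model.encode(texts, normalize_embeddings=True)            # (n, k) unit vectors
    return (emb @ ref.T).mean(axis=1)                               # one score per text

model = SentenceTransformer("all-MiniLM-L6-v2")  # assumed encoder choice

# Toy corpora for illustration only.
human_corpus = ["the service at this clinic was friendly and fast",
                "wait times were short and the staff were helpful",
                "my appointment was rescheduled twice without notice"]
synthetic_corpus = ["the clinic offers services to patients",
                    "appointments can be scheduled by patients"]

# Rank the human corpus by centrality: most representative texts first.
human_scores = depth_scores(human_corpus, human_corpus, model)
for i in np.argsort(-human_scores):
    print(f"{human_scores[i]:.3f}  {human_corpus[i]}")

# Rank-sum test: are synthetic texts systematically less central with respect
# to the human corpus than the human texts themselves (a distributional shift)?
stat, p = ranksums(depth_scores(synthetic_corpus, human_corpus, model), human_scores)
print(f"rank-sum statistic = {stat:.3f}, p = {p:.4g}")

In the paper itself, the centrality score is the authors' TTE depth rather than this cosine-similarity stand-in, but the ranking and rank-sum testing steps plug in the same way.
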
Anthology ID: 2023.emnlp-main.596
Volume: Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month: December
Year: 2023
Address: Singapore
Editors: Houda Bouamor, Juan Pino, Kalika Bali
Venue: EMNLP
Publisher: Association for Computational Linguistics
Pages: 9600–9611
URL: https://aclanthology.org/2023.emnlp-main.596
DOI: 10.18653/v1/2023.emnlp-main.596
Cite (ACL): Parker Seegmiller and Sarah Preum. 2023. Statistical Depth for Ranking and Characterizing Transformer-Based Text Embeddings. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 9600–9611, Singapore. Association for Computational Linguistics.
Cite (Informal): Statistical Depth for Ranking and Characterizing Transformer-Based Text Embeddings (Seegmiller & Preum, EMNLP 2023)
PDF: https://aclanthology.org/2023.emnlp-main.596.pdf
Video: https://aclanthology.org/2023.emnlp-main.596.mp4