Bridging Information-Theoretic and Geometric Compression in Language Models

Emily Cheng, Corentin Kervadec, Marco Baroni


Abstract
For a language model (LM) to faithfully model human language, it must compress vast, potentially infinite information into relatively few dimensions. We propose analyzing compression in (pre-trained) LMs from two points of view: geometric and information-theoretic. We demonstrate that the two views are highly correlated, such that the intrinsic geometric dimension of linguistic data predicts their coding length under the LM. We then show that, in turn, high compression of a linguistic dataset predicts rapid adaptation to that dataset, confirming that being able to compress linguistic information is an important part of successful LM performance. As a practical byproduct of our analysis, we evaluate a battery of intrinsic dimension estimators for the first time on linguistic data, showing that only some encapsulate the relationship between information-theoretic compression, geometric compression, and ease-of-adaptation.
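The paper evaluates a battery of intrinsic dimension (ID) estimators on LM representations. As a hedged illustration only (this is not the authors' code, and the paper tests several estimators), the sketch below implements one standard geometric ID estimator, TwoNN (Facco et al., 2017), which uses the ratio of each point's second- to first-nearest-neighbor distance and a maximum-likelihood fit:

```python
import numpy as np

def two_nn_id(points):
    """TwoNN intrinsic-dimension estimate.

    For each point, the ratio mu = r2 / r1 of its second- to
    first-nearest-neighbor distances follows a Pareto distribution
    whose shape parameter equals the intrinsic dimension d; the
    maximum-likelihood estimate is d = N / sum(log mu_i).
    """
    X = np.asarray(points, dtype=float)
    # pairwise squared Euclidean distances via broadcasting
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    np.fill_diagonal(d2, np.inf)      # exclude self-distances
    d2.sort(axis=1)                   # row-wise ascending
    r1 = np.sqrt(d2[:, 0])            # first-NN distance
    r2 = np.sqrt(d2[:, 1])            # second-NN distance
    mu = r2 / r1
    return len(X) / np.sum(np.log(mu))

# sanity check: a 2-D Gaussian cloud linearly embedded in 10-D
# ambient space should yield an estimate close to 2
rng = np.random.default_rng(0)
coords = rng.standard_normal((2000, 2))
basis = rng.standard_normal((2, 10))
est = two_nn_id(coords @ basis)
```

The quadratic-memory distance computation is only for illustration; practical ID estimation on large representation sets would use a k-nearest-neighbor index instead.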
Anthology ID:
2023.emnlp-main.762
Volume:
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
12397–12420
URL:
https://aclanthology.org/2023.emnlp-main.762
DOI:
10.18653/v1/2023.emnlp-main.762
Cite (ACL):
Emily Cheng, Corentin Kervadec, and Marco Baroni. 2023. Bridging Information-Theoretic and Geometric Compression in Language Models. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 12397–12420, Singapore. Association for Computational Linguistics.
Cite (Informal):
Bridging Information-Theoretic and Geometric Compression in Language Models (Cheng et al., EMNLP 2023)
PDF:
https://aclanthology.org/2023.emnlp-main.762.pdf
Video:
https://aclanthology.org/2023.emnlp-main.762.mp4