Revisiting Offline Compression: Going Beyond Factorization-based Methods for Transformer Language Models

Mohammadreza Banaei, Klaudia Bałazy, Artur Kasymov, Rémi Lebret, Jacek Tabor, Karl Aberer


Abstract
Recent transformer language models achieve outstanding results in many natural language processing (NLP) tasks. However, their enormous size often makes them impractical on memory-constrained devices, requiring practitioners to compress them into smaller networks. In this paper, we explore offline compression methods, meaning computationally cheap approaches that do not require further fine-tuning of the compressed model. We challenge the classical matrix factorization methods by proposing a novel, better-performing autoencoder-based framework. We perform a comprehensive ablation study of our approach, examining its different aspects over a diverse set of evaluation settings. Moreover, we show that enabling collaboration between modules across layers by compressing certain modules together positively impacts the final model performance. Experiments on various NLP tasks demonstrate that our approach significantly outperforms commonly used factorization-based offline compression methods.
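To make the contrast concrete, the sketch below illustrates the general idea of offline weight compression on a single pretrained weight matrix: a truncated-SVD factorization baseline versus a small nonlinear autoencoder trained only to reconstruct the weights, with no task fine-tuning. This is not the authors' implementation; all shapes, hyperparameters, and architectural choices are placeholder assumptions for illustration.

```python
# Hedged sketch: offline compression of one weight matrix.
# Baseline: truncated SVD factorization. Alternative: a small nonlinear
# autoencoder fit to reconstruct the matrix rows (no task fine-tuning).
# Shapes, rank, and training settings are arbitrary assumptions.
import torch

torch.manual_seed(0)
W = torch.randn(768, 3072)   # stand-in for a pretrained transformer weight
rank = 64                    # assumed compression budget

# --- Factorization baseline: W ≈ U_r diag(S_r) V_r^T ---
U, S, Vh = torch.linalg.svd(W, full_matrices=False)
W_svd = U[:, :rank] @ torch.diag(S[:rank]) @ Vh[:rank, :]

# --- Autoencoder alternative: nonlinear encoder/decoder around a rank-sized code ---
autoencoder = torch.nn.Sequential(
    torch.nn.Linear(W.shape[1], rank, bias=False),  # encoder to bottleneck
    torch.nn.ReLU(),                                # nonlinearity (unlike factorization)
    torch.nn.Linear(rank, W.shape[1], bias=False),  # decoder back to full width
)
opt = torch.optim.Adam(autoencoder.parameters(), lr=1e-3)

for step in range(2000):  # cheap offline reconstruction training only
    recon = autoencoder(W)
    loss = torch.nn.functional.mse_loss(recon, W)
    opt.zero_grad()
    loss.backward()
    opt.step()

W_ae = autoencoder(W).detach()

print("SVD reconstruction error:", torch.norm(W - W_svd).item())
print("AE  reconstruction error:", torch.norm(W - W_ae).item())
```

In practice the compressed model would store only the small factors (or the encoder codes plus decoder) in place of the full matrix; the paper additionally studies compressing certain modules together across layers, which this single-matrix sketch does not show.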
Anthology ID:
2023.findings-eacl.133
Volume:
Findings of the Association for Computational Linguistics: EACL 2023
Month:
May
Year:
2023
Address:
Dubrovnik, Croatia
Editors:
Andreas Vlachos, Isabelle Augenstein
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
1788–1805
URL:
https://aclanthology.org/2023.findings-eacl.133
DOI:
10.18653/v1/2023.findings-eacl.133
Cite (ACL):
Mohammadreza Banaei, Klaudia Bałazy, Artur Kasymov, Rémi Lebret, Jacek Tabor, and Karl Aberer. 2023. Revisiting Offline Compression: Going Beyond Factorization-based Methods for Transformer Language Models. In Findings of the Association for Computational Linguistics: EACL 2023, pages 1788–1805, Dubrovnik, Croatia. Association for Computational Linguistics.
Cite (Informal):
Revisiting Offline Compression: Going Beyond Factorization-based Methods for Transformer Language Models (Banaei et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-eacl.133.pdf
Video:
https://aclanthology.org/2023.findings-eacl.133.mp4