Direction is what you need: Improving Word Embedding Compression in Large Language Models

Klaudia Bałazy, Mohammadreza Banaei, Rémi Lebret, Jacek Tabor, Karl Aberer


Abstract
The adoption of Transformer-based models in natural language processing (NLP) has led to great success at the cost of a massive number of parameters. However, due to deployment constraints on edge devices, there has been rising interest in compressing these models to improve their inference time and memory footprint. This paper presents a novel loss objective to compress token embeddings in Transformer-based models by leveraging an AutoEncoder architecture. More specifically, we emphasize the importance of the direction of compressed embeddings with respect to the original uncompressed embeddings. The proposed method is task-agnostic and does not require further language modeling pre-training. Our method significantly outperforms the commonly used SVD-based matrix-factorization approach in terms of initial language model perplexity. Moreover, we evaluate our proposed approach on the SQuAD v1.1 dataset and several downstream tasks from the GLUE benchmark, where we also outperform the baseline in most scenarios. Our code is publicly available.
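
The sketch below illustrates, in PyTorch, the general idea the abstract describes: a small autoencoder compresses the token embedding matrix, and the reconstruction loss includes a term that penalizes directional (cosine) mismatch between original and reconstructed embeddings. The layer sizes, the `alpha` weighting, and the function names are illustrative assumptions, not the authors' exact implementation; see the linked repository for the actual code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class EmbeddingAutoEncoder(nn.Module):
    """Linear autoencoder that compresses token embeddings to a smaller dimension."""
    def __init__(self, emb_dim=768, compressed_dim=128):
        super().__init__()
        self.encoder = nn.Linear(emb_dim, compressed_dim)
        self.decoder = nn.Linear(compressed_dim, emb_dim)

    def forward(self, embeddings):
        compressed = self.encoder(embeddings)
        return self.decoder(compressed)

def direction_aware_loss(original, reconstructed, alpha=1.0):
    """MSE reconstruction loss plus a cosine-distance term that preserves
    the direction of the original embedding vectors.
    `alpha` balances the two terms (illustrative value, not from the paper)."""
    mse = F.mse_loss(reconstructed, original)
    cosine_distance = 1.0 - F.cosine_similarity(reconstructed, original, dim=-1).mean()
    return mse + alpha * cosine_distance

if __name__ == "__main__":
    # Compress a (vocab_size x emb_dim) embedding matrix, e.g. BERT-base's.
    embedding_matrix = torch.randn(30522, 768)
    model = EmbeddingAutoEncoder(emb_dim=768, compressed_dim=128)
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

    for step in range(100):
        batch = embedding_matrix[torch.randint(0, embedding_matrix.size(0), (256,))]
        loss = direction_aware_loss(batch, model(batch), alpha=1.0)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```

At inference time, only the compressed embeddings and the decoder need to be stored, which is what reduces the model's memory footprint relative to keeping the full embedding matrix.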
Anthology ID:
2021.repl4nlp-1.32
Original:
2021.repl4nlp-1.32v1
Version 2:
2021.repl4nlp-1.32v2
Volume:
Proceedings of the 6th Workshop on Representation Learning for NLP (RepL4NLP-2021)
Month:
August
Year:
2021
Address:
Online
Editors:
Anna Rogers, Iacer Calixto, Ivan Vulić, Naomi Saphra, Nora Kassner, Oana-Maria Camburu, Trapit Bansal, Vered Shwartz
Venue:
RepL4NLP
Publisher:
Association for Computational Linguistics
Pages:
322–330
URL:
https://aclanthology.org/2021.repl4nlp-1.32
DOI:
10.18653/v1/2021.repl4nlp-1.32
Cite (ACL):
Klaudia Bałazy, Mohammadreza Banaei, Rémi Lebret, Jacek Tabor, and Karl Aberer. 2021. Direction is what you need: Improving Word Embedding Compression in Large Language Models. In Proceedings of the 6th Workshop on Representation Learning for NLP (RepL4NLP-2021), pages 322–330, Online. Association for Computational Linguistics.
Cite (Informal):
Direction is what you need: Improving Word Embedding Compression in Large Language Models (Bałazy et al., RepL4NLP 2021)
PDF:
https://aclanthology.org/2021.repl4nlp-1.32.pdf
Code
 MohammadrezaBanaei/orientation_based_embedding_compression
Data
GLUE, MRPC, SQuAD, SST, SST-2