Retrieval-based Knowledge Transfer: An Effective Approach for Extreme Large Language Model Compression

Jiduan Liu, Jiahao Liu, Qifan Wang, Jingang Wang, Xunliang Cai, Dongyan Zhao, Ran Wang, Rui Yan


Abstract
Large language models (LLMs) have demonstrated exceptional performance on various natural language processing (NLP) tasks. However, the massive size of these models poses significant challenges for their deployment in real-world applications. While numerous model compression techniques have been proposed, most of them are not well-suited for achieving extreme compression when there is a large gap in model scale. In this paper, we introduce a novel compression paradigm called Retrieval-based Knowledge Transfer (RetriKT), which effectively transfers the knowledge of LLMs to extremely small-scale models (e.g., 1% of the original size). In particular, our approach extracts knowledge from LLMs to construct a knowledge store, from which the small-scale model can retrieve relevant information and leverage it for effective inference. To improve the quality of the knowledge store, soft prompt tuning and Proximal Policy Optimization (PPO) reinforcement learning techniques are employed. Extensive experiments are conducted on low-resource tasks from the SuperGLUE and GLUE benchmarks. The results demonstrate that the proposed approach significantly enhances the performance of small-scale models by leveraging the knowledge from LLMs.
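To make the retrieval step concrete, here is a minimal illustrative sketch, not the paper's actual implementation: a knowledge store mapping vector keys (input representations) to values (e.g., soft labels produced by a large teacher model), from which a small model could retrieve the nearest entries at inference time. All names (`KnowledgeStore`, the example entries) are hypothetical.

```python
# Illustrative sketch of a retrieval-based knowledge store (assumed design,
# not the authors' code). Keys are input representations; values are
# teacher-model outputs. Retrieval returns the k most similar entries.
import math

def cosine(a, b):
    # Cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

class KnowledgeStore:
    def __init__(self):
        self.keys, self.values = [], []

    def add(self, key, value):
        # key: vector representation of an input; value: teacher output.
        self.keys.append(key)
        self.values.append(value)

    def retrieve(self, query, k=2):
        # Return the values of the k entries most similar to the query.
        scored = sorted(
            zip(self.keys, self.values),
            key=lambda kv: cosine(query, kv[0]),
            reverse=True,
        )
        return [v for _, v in scored[:k]]

# Hypothetical usage: populate the store with teacher predictions, then
# retrieve the nearest knowledge for a new query representation.
store = KnowledgeStore()
store.add([1.0, 0.0], {"label": "positive", "score": 0.9})
store.add([0.0, 1.0], {"label": "negative", "score": 0.8})
store.add([0.9, 0.1], {"label": "positive", "score": 0.7})
top = store.retrieve([1.0, 0.1], k=2)
```

In the paper's setting, the retrieved knowledge would then be combined with the small model's own computation to produce the final prediction; this sketch only shows the store-and-lookup mechanics.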
Anthology ID:
2023.findings-emnlp.578
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2023
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
8643–8657
URL:
https://aclanthology.org/2023.findings-emnlp.578
DOI:
10.18653/v1/2023.findings-emnlp.578
Cite (ACL):
Jiduan Liu, Jiahao Liu, Qifan Wang, Jingang Wang, Xunliang Cai, Dongyan Zhao, Ran Wang, and Rui Yan. 2023. Retrieval-based Knowledge Transfer: An Effective Approach for Extreme Large Language Model Compression. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 8643–8657, Singapore. Association for Computational Linguistics.
Cite (Informal):
Retrieval-based Knowledge Transfer: An Effective Approach for Extreme Large Language Model Compression (Liu et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-emnlp.578.pdf