MiniALBERT: Model Distillation via Parameter-Efficient Recursive Transformers

Mohammadmahdi Nouriborji, Omid Rohanian, Samaneh Kouchaki, David A. Clifton


Abstract
Pre-trained Language Models (LMs) have become an integral part of Natural Language Processing (NLP) in recent years, due to their superior performance in downstream applications. In spite of this resounding success, the usability of LMs is constrained by their computational and time complexity, along with their ever-increasing size; an issue that has been referred to as overparameterisation. Different strategies have been proposed in the literature to alleviate these problems, with the aim of creating effective compact models that match the performance of their larger counterparts with negligible performance losses. One of the most popular techniques in this area of research is model distillation. Another potent but under-utilised technique is cross-layer parameter sharing. In this work, we combine these two strategies and present MiniALBERT, a technique for distilling the knowledge of fully parameterised LMs (such as BERT) into a compact recursive student. In addition, we investigate the application of bottleneck adapters for layer-wise adaptation of our recursive student, and also explore the efficacy of adapter tuning for fine-tuning compact models. We test our proposed models on a number of general and biomedical NLP tasks to demonstrate their viability and compare them with state-of-the-art and other existing compact models. All the code used in the experiments and the pre-trained compact models will be made publicly available.
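To make the two ideas in the abstract concrete, the sketch below is a minimal, self-contained illustration (our own, not the authors' released code) of a recursive transformer: one shared encoder layer is unrolled several times (cross-layer parameter sharing, as in ALBERT), and a small bottleneck adapter per recursion step provides layer-wise adaptation. All module names and hyperparameters here (hidden size 768, six recursions, bottleneck size 64) are assumptions for illustration only.

```python
# Minimal sketch (assumed, not the authors' implementation) of a
# parameter-shared "recursive" encoder with layer-wise bottleneck adapters.
import torch
import torch.nn as nn


class BottleneckAdapter(nn.Module):
    """Down-project -> non-linearity -> up-project, with a residual connection."""
    def __init__(self, hidden_size: int, bottleneck_size: int = 64):
        super().__init__()
        self.down = nn.Linear(hidden_size, bottleneck_size)
        self.up = nn.Linear(bottleneck_size, hidden_size)
        self.act = nn.GELU()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + self.up(self.act(self.down(x)))


class RecursiveEncoder(nn.Module):
    """One shared transformer layer applied `num_recursions` times,
    with a separate (cheap) adapter for each recursion step."""
    def __init__(self, hidden_size: int = 768, num_heads: int = 12,
                 num_recursions: int = 6, bottleneck_size: int = 64):
        super().__init__()
        self.shared_layer = nn.TransformerEncoderLayer(
            d_model=hidden_size, nhead=num_heads,
            dim_feedforward=4 * hidden_size, batch_first=True)
        self.adapters = nn.ModuleList(
            [BottleneckAdapter(hidden_size, bottleneck_size)
             for _ in range(num_recursions)])

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        for adapter in self.adapters:          # same layer weights reused each step
            x = adapter(self.shared_layer(x))  # layer-wise adaptation via adapter
        return x


# Usage example: a batch of 2 sequences of length 16 with hidden size 768.
hidden = torch.randn(2, 16, 768)
encoder = RecursiveEncoder()
print(encoder(hidden).shape)  # torch.Size([2, 16, 768])
```

In such a setup, only the shared layer is counted once in the parameter budget, and the adapters add a small per-step cost; adapter tuning for downstream tasks would then update only the adapter (and task head) parameters while keeping the shared layer frozen.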
Anthology ID:
2023.eacl-main.83
Volume:
Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics
Month:
May
Year:
2023
Address:
Dubrovnik, Croatia
Editors:
Andreas Vlachos, Isabelle Augenstein
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
1161–1173
URL:
https://aclanthology.org/2023.eacl-main.83
DOI:
10.18653/v1/2023.eacl-main.83
Cite (ACL):
Mohammadmahdi Nouriborji, Omid Rohanian, Samaneh Kouchaki, and David A. Clifton. 2023. MiniALBERT: Model Distillation via Parameter-Efficient Recursive Transformers. In Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics, pages 1161–1173, Dubrovnik, Croatia. Association for Computational Linguistics.
Cite (Informal):
MiniALBERT: Model Distillation via Parameter-Efficient Recursive Transformers (Nouriborji et al., EACL 2023)
PDF:
https://aclanthology.org/2023.eacl-main.83.pdf
Video:
 https://aclanthology.org/2023.eacl-main.83.mp4