A Study on Knowledge Distillation from Weak Teacher for Scaling Up Pre-trained Language Models

Hayeon Lee, Rui Hou, Jongpil Kim, Davis Liang, Sung Ju Hwang, Alexander Min


Abstract
Distillation from Weak Teacher (DWT) is a method of transferring knowledge from a smaller, weaker teacher model to a larger student model to improve the student's performance. Previous studies have shown that DWT can be effective in the vision domain and in the natural language processing (NLP) pre-training stage. In particular, DWT shows promise in practical scenarios such as enhancing new-generation or larger models with pre-trained yet older or smaller models when the resource budget is limited. However, the optimal conditions for using DWT have yet to be fully investigated in NLP pre-training. Therefore, this study examines three key factors for optimizing DWT, distinct from those used in the vision domain or in traditional knowledge distillation: (i) the impact of teacher model quality on DWT effectiveness, (ii) guidelines for adjusting the weighting value of the DWT loss, and (iii) the impact of parameter remapping as a student model initialization technique for DWT.
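As a rough illustration of the weighting value mentioned in (ii), the sketch below shows a generic distillation objective that mixes the student's masked-language-modeling loss with a temperature-softened KL term from the (weaker) teacher. This is a minimal sketch under standard knowledge-distillation assumptions, not the paper's exact objective; dwt_loss, alpha, and temperature are illustrative names.

import torch.nn.functional as F

def dwt_loss(student_logits, teacher_logits, labels, alpha=0.5, temperature=2.0):
    # Masked-language-modeling loss on the student's own predictions
    # (positions without a masked label are marked with -100 and ignored).
    mlm_loss = F.cross_entropy(
        student_logits.view(-1, student_logits.size(-1)),
        labels.view(-1),
        ignore_index=-100,
    )
    # KL divergence between temperature-softened teacher and student distributions.
    kd_loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * (temperature ** 2)
    # alpha weights the distillation term against the MLM term; tuning it is
    # one of the factors the study examines.
    return (1.0 - alpha) * mlm_loss + alpha * kd_loss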
Anthology ID:
2023.findings-acl.714
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
11239–11246
URL:
https://aclanthology.org/2023.findings-acl.714
DOI:
10.18653/v1/2023.findings-acl.714
Cite (ACL):
Hayeon Lee, Rui Hou, Jongpil Kim, Davis Liang, Sung Ju Hwang, and Alexander Min. 2023. A Study on Knowledge Distillation from Weak Teacher for Scaling Up Pre-trained Language Models. In Findings of the Association for Computational Linguistics: ACL 2023, pages 11239–11246, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
A Study on Knowledge Distillation from Weak Teacher for Scaling Up Pre-trained Language Models (Lee et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-acl.714.pdf