Weight-Inherited Distillation for Task-Agnostic BERT Compression

Taiqiang Wu, Cheng Hou, Shanshan Lao, Jiayi Li, Ngai Wong, Zhe Zhao, Yujiu Yang


Abstract
Knowledge Distillation (KD) is a predominant approach for BERT compression. Previous KD-based methods focus on designing extra alignment losses for the student model to mimic the behavior of the teacher model. These methods transfer the knowledge in an indirect way. In this paper, we propose a novel Weight-Inherited Distillation (WID), which directly transfers knowledge from the teacher. WID does not require any additional alignment loss and trains a compact student by inheriting the weights, showing a new perspective of knowledge distillation. Specifically, we design the row compactors and column compactors as mappings and then compress the weights via structural re-parameterization. Experimental results on the GLUE and SQuAD benchmarks show that WID outperforms previous state-of-the-art KD-based baselines. Further analysis indicates that WID can also learn the attention patterns from the teacher model without any alignment loss on attention distributions. The code is available at https://github.com/wutaiqiang/WID-NAACL2024.
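To make the compactor idea in the abstract concrete, the following is a minimal PyTorch sketch of the general mechanism: a frozen teacher weight matrix is wrapped by a trainable row compactor and column compactor, and their product is folded into a single smaller student weight (structural re-parameterization). This is an illustrative simplification, not the authors' implementation (see the linked repository); the class name, argument names, rectangular compactor shapes, and identity initialization are assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F

class CompactedLinear(nn.Module):
    """Wraps a frozen teacher nn.Linear with trainable row/column compactors (sketch)."""
    def __init__(self, teacher: nn.Linear, d_out_student: int, d_in_student: int):
        super().__init__()
        d_out, d_in = teacher.weight.shape
        # Inherited (frozen) teacher parameters.
        self.w_t = nn.Parameter(teacher.weight.detach().clone(), requires_grad=False)
        self.b_t = nn.Parameter(teacher.bias.detach().clone(), requires_grad=False)
        # Trainable mappings from teacher rows/columns to student rows/columns,
        # initialized here as truncated identities (an illustrative choice).
        self.row_compactor = nn.Parameter(torch.eye(d_out)[:d_out_student].clone())
        self.col_compactor = nn.Parameter(torch.eye(d_in)[:, :d_in_student].clone())

    def folded_weight(self):
        # Structural re-parameterization: the triple product is itself a smaller linear layer.
        w_s = self.row_compactor @ self.w_t @ self.col_compactor   # (d_out_student, d_in_student)
        b_s = self.row_compactor @ self.b_t                        # (d_out_student,)
        return w_s, b_s

    def forward(self, x):
        # Simplification: x is assumed to already have the student input dimension.
        w_s, b_s = self.folded_weight()
        return F.linear(x, w_s, b_s)

# Usage: train the compactors (e.g. on the pre-training objective), then export
# the folded weights into a plain, compact nn.Linear for the student model.
teacher_layer = nn.Linear(768, 768)
layer = CompactedLinear(teacher_layer, d_out_student=384, d_in_student=384)
w_s, b_s = layer.folded_weight()
student_layer = nn.Linear(384, 384)
student_layer.weight.data.copy_(w_s)
student_layer.bias.data.copy_(b_s)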
Anthology ID:
2024.findings-naacl.2
Volume:
Findings of the Association for Computational Linguistics: NAACL 2024
Month:
June
Year:
2024
Address:
Mexico City, Mexico
Editors:
Kevin Duh, Helena Gomez, Steven Bethard
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
13–28
URL:
https://aclanthology.org/2024.findings-naacl.2
Cite (ACL):
Taiqiang Wu, Cheng Hou, Shanshan Lao, Jiayi Li, Ngai Wong, Zhe Zhao, and Yujiu Yang. 2024. Weight-Inherited Distillation for Task-Agnostic BERT Compression. In Findings of the Association for Computational Linguistics: NAACL 2024, pages 13–28, Mexico City, Mexico. Association for Computational Linguistics.
Cite (Informal):
Weight-Inherited Distillation for Task-Agnostic BERT Compression (Wu et al., Findings 2024)
PDF:
https://aclanthology.org/2024.findings-naacl.2.pdf
Copyright:
2024.findings-naacl.2.copyright.pdf