Distilling Linguistic Context for Language Model Compression

Geondo Park, Gyeongman Kim, Eunho Yang


Abstract
A computationally expensive and memory-intensive neural network lies behind the recent success of language representation learning. Knowledge distillation, a major technique for deploying such vast language models in resource-scarce environments, transfers knowledge at the level of individual word representations, learned without restriction. In this paper, inspired by recent observations that language representations are relatively positioned and carry more semantic knowledge as a whole, we present a new knowledge distillation objective for language representation learning that transfers contextual knowledge via two types of relationships across representations: Word Relation and Layer Transforming Relation. Unlike other recent distillation techniques for language models, our contextual distillation imposes no restrictions on architectural differences between teacher and student. We validate the effectiveness of our method on challenging benchmarks of language understanding tasks, not only across architectures of various sizes but also in combination with DynaBERT, the recently proposed adaptive size pruning method.
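To make the two relation types concrete, the following is a minimal PyTorch sketch of relation-based distillation, assuming hidden states of shape (batch, seq_len, dim). It is not the authors' objective or their released code (see geondopark/ckd for that); the cosine-similarity relation matrices, the MSE matching loss, and the particular choice of layer pair are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def pairwise_cosine(hidden):
    # Token-to-token cosine similarity matrix for one layer.
    # hidden: (batch, seq_len, dim) -> (batch, seq_len, seq_len)
    hidden = F.normalize(hidden, dim=-1)
    return hidden @ hidden.transpose(1, 2)

def word_relation_loss(student_hidden, teacher_hidden):
    # Match the student's token-relation matrix to the teacher's. The
    # matrices are (seq_len x seq_len), so teacher and student may use
    # different hidden sizes without any projection layer.
    return F.mse_loss(pairwise_cosine(student_hidden),
                      pairwise_cosine(teacher_hidden))

def layer_transforming_loss(student_layers, teacher_layers):
    # Compare how each token's representation changes from a lower to a
    # higher layer (cosine similarity across depth), teacher vs. student.
    def depth_similarity(lower, upper):
        lower = F.normalize(lower, dim=-1)
        upper = F.normalize(upper, dim=-1)
        return (lower * upper).sum(dim=-1)  # (batch, seq_len)
    return F.mse_loss(depth_similarity(student_layers[0], student_layers[-1]),
                      depth_similarity(teacher_layers[0], teacher_layers[-1]))

# Usage sketch with random tensors standing in for per-layer hidden states
# (e.g. a HuggingFace model called with output_hidden_states=True).
if __name__ == "__main__":
    s_layers = [torch.randn(2, 8, 384) for _ in range(4)]   # small student
    t_layers = [torch.randn(2, 8, 768) for _ in range(12)]  # larger teacher
    loss = (word_relation_loss(s_layers[-1], t_layers[-1])
            + layer_transforming_loss(s_layers, t_layers))
    print(loss.item())
```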
Anthology ID:
2021.emnlp-main.30
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
364–378
URL:
https://aclanthology.org/2021.emnlp-main.30
DOI:
10.18653/v1/2021.emnlp-main.30
Cite (ACL):
Geondo Park, Gyeongman Kim, and Eunho Yang. 2021. Distilling Linguistic Context for Language Model Compression. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 364–378, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Distilling Linguistic Context for Language Model Compression (Park et al., EMNLP 2021)
PDF:
https://aclanthology.org/2021.emnlp-main.30.pdf
Video:
 https://aclanthology.org/2021.emnlp-main.30.mp4
Code
 geondopark/ckd
Data
GLUE, QNLI, SQuAD