Chenglong Wang, Yi Lu, Yongyu Mu, Yimin Hu, Tong Xiao, and Jingbo Zhu. 2022. Improved Knowledge Distillation for Pre-trained Language Models via Knowledge Selection. In Findings of the Association for Computational Linguistics: EMNLP 2022 (eds. Yoav Goldberg, Zornitsa Kozareva, and Yue Zhang), pages 6232–6244, Abu Dhabi, United Arab Emirates, December 2022. Association for Computational Linguistics. Anthology ID: wang-etal-2022-improved. DOI: 10.18653/v1/2022.findings-emnlp.464. URL: https://aclanthology.org/2022.findings-emnlp.464/