Xinge Ma, Jin Wang, Liang-Chih Yu, and Xuejie Zhang. October 2022. Knowledge Distillation with Reptile Meta-Learning for Pretrained Language Model Compression. In Nicoletta Calzolari, Chu-Ren Huang, Hansaem Kim, James Pustejovsky, Leo Wanner, Key-Sun Choi, Pum-Mo Ryu, Hsin-Hsi Chen, Lucia Donatelli, Heng Ji, Sadao Kurohashi, Patrizia Paggio, Nianwen Xue, Seokhwan Kim, Younggyun Hahm, Zhong He, Tony Kyungil Lee, Enrico Santus, Francis Bond, and Seung-Hoon Na, editors, Proceedings of the 29th International Conference on Computational Linguistics, pages 4907–4917, Gyeongju, Republic of Korea. International Committee on Computational Linguistics. Anthology ID: ma-etal-2022-knowledge. URL: https://aclanthology.org/2022.coling-1.435/