Yunzhi Yao, Shaohan Huang, Wenhui Wang, Li Dong, and Furu Wei. 2021. Adapt-and-Distill: Developing Small, Fast and Effective Pretrained Language Models for Domains. In Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021 (Chengqing Zong, Fei Xia, Wenjie Li, and Roberto Navigli, editors), pages 460–470, Online. Association for Computational Linguistics. Anthology ID: yao-etal-2021-adapt. DOI: 10.18653/v1/2021.findings-acl.40. https://aclanthology.org/2021.findings-acl.40/