Cheng Chen, Yichun Yin, Lifeng Shang, Xin Jiang, Yujia Qin, Fengyu Wang, Zhi Wang, Xiao Chen, Zhiyuan Liu, and Qun Liu. 2022. bert2BERT: Towards Reusable Pretrained Language Models. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 2134–2148, Dublin, Ireland. Association for Computational Linguistics. Editors: Smaranda Muresan, Preslav Nakov, and Aline Villavicencio. Anthology ID: chen-etal-2022-bert2bert. DOI: 10.18653/v1/2022.acl-long.151. URL: https://aclanthology.org/2022.acl-long.151/