@inproceedings{gao-etal-2023-small,
    title     = "Small Pre-trained Language Models Can be Fine-tuned as Large Models via Over-Parameterization",
    author    = "Gao, Ze-Feng and Zhou, Kun and Liu, Peiyu and Zhao, Wayne Xin and Wen, Ji-Rong",
    editor    = "Rogers, Anna and Boyd-Graber, Jordan and Okazaki, Naoaki",
    booktitle = "Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)",
    month     = jul,
    year      = "2023",
    address   = "Toronto, Canada",
    publisher = "Association for Computational Linguistics",
    url       = "https://aclanthology.org/2023.acl-long.212/",
    doi       = "10.18653/v1/2023.acl-long.212",
    pages     = "3819--3834",
}