Ryokan Ri and Yoshimasa Tsuruoka. 2022. Pretraining with Artificial Language: Studying Transferable Knowledge in Language Models. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), edited by Smaranda Muresan, Preslav Nakov, and Aline Villavicencio, pages 7302–7315, Dublin, Ireland, May 2022. Association for Computational Linguistics. DOI: 10.18653/v1/2022.acl-long.504. URL: https://aclanthology.org/2022.acl-long.504/. Anthology entry: ri-tsuruoka-2022-pretraining.