@inproceedings{zhao-etal-2023-pre,
    title = "Pre-trained Language Models Can be Fully Zero-Shot Learners",
    author = "Zhao, Xuandong  and
      Ouyang, Siqi  and
      Yu, Zhiguo  and
      Wu, Ming  and
      Li, Lei",
    editor = "Rogers, Anna  and
      Boyd-Graber, Jordan  and
      Okazaki, Naoaki",
    booktitle = "Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)",
    month = jul,
    year = "2023",
    address = "Toronto, Canada",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2023.acl-long.869/",
    doi = "10.18653/v1/2023.acl-long.869",
    pages = "15590--15606",
}