@inproceedings{shin-etal-2022-effect,
    title = "On the Effect of Pretraining Corpora on In-context Learning by a Large-scale Language Model",
    author = "Shin, Seongjin  and
      Lee, Sang-Woo  and
      Ahn, Hwijeen  and
      Kim, Sungdong  and
      Kim, HyoungSeok  and
      Kim, Boseop  and
      Cho, Kyunghyun  and
      Lee, Gichang  and
      Park, Woomyoung  and
      Ha, Jung-Woo  and
      Sung, Nako",
    editor = "Carpuat, Marine  and
      de Marneffe, Marie-Catherine  and
      Meza Ruiz, Ivan Vladimir",
    booktitle = "Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies",
    month = jul,
    year = "2022",
    address = "Seattle, United States",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2022.naacl-main.380/",
    doi = "10.18653/v1/2022.naacl-main.380",
    pages = "5168--5186",
}