Xisen Jin, Dejiao Zhang, Henghui Zhu, Wei Xiao, Shang-Wen Li, Xiaokai Wei, Andrew Arnold, and Xiang Ren. 2022. Lifelong Pretraining: Continually Adapting Language Models to Emerging Corpora. In Proceedings of BigScience Episode #5 – Workshop on Challenges & Perspectives in Creating Large Language Models, pages 1–16, virtual+Dublin. Association for Computational Linguistics. Edited by Angela Fan, Suzana Ilic, Thomas Wolf, and Matthias Gallé.
Anthology ID: jin-etal-2022-lifelong
DOI: 10.18653/v1/2022.bigscience-1.1
URL: https://aclanthology.org/2022.bigscience-1.1/