@inproceedings{gururangan-etal-2020-dont,
    title = "Don't Stop Pretraining: Adapt Language Models to Domains and Tasks",
    author = "Gururangan, Suchin and Marasovi{\'c}, Ana and Swayamdipta, Swabha and Lo, Kyle and Beltagy, Iz and Downey, Doug and Smith, Noah A.",
    editor = "Jurafsky, Dan and Chai, Joyce and Schluter, Natalie and Tetreault, Joel",
    booktitle = "Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics",
    month = jul,
    year = "2020",
    address = "Online",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2020.acl-main.740/",
    doi = "10.18653/v1/2020.acl-main.740",
    pages = "8342--8360",
}