Revisiting Pretraining with Adapters

Seungwon Kim, Alex Shum, Nathan Susanj, Jonathan Hilgart


Abstract
Pretrained language models have served as the backbone for many state-of-the-art NLP results. These models are large and expensive to train. Recent work suggests that continued pretraining on task-specific data is worth the effort, as it leads to improved performance on downstream tasks. We explore alternatives to full-scale task-specific pretraining of language models through the use of adapter modules, a parameter-efficient approach to transfer learning. We find that adapter-based pretraining achieves results comparable to task-specific pretraining while using only a fraction of the overall trainable parameters. We further explore direct use of adapters without pretraining and find that direct fine-tuning performs mostly on par with pretrained adapter models, contradicting the previously proposed benefits of continual pretraining in full pretraining-then-fine-tuning strategies. Lastly, we perform an ablation study on task-adaptive pretraining to investigate how different hyperparameter settings affect the effectiveness of the pretraining.
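For readers unfamiliar with adapter modules, the sketch below shows a standard bottleneck adapter in the style commonly used for parameter-efficient transfer learning: a down-projection, a nonlinearity, an up-projection, and a residual connection, with the backbone frozen so only the adapter weights train. The hidden size, bottleneck size, initialization, and the freezing helper are illustrative assumptions, not the authors' exact configuration.

```python
import torch
import torch.nn as nn


class BottleneckAdapter(nn.Module):
    """Illustrative bottleneck adapter (down-project, nonlinearity, up-project,
    plus a residual connection). Sizes are assumptions, not the paper's setup."""

    def __init__(self, hidden_size: int = 768, bottleneck_size: int = 64):
        super().__init__()
        self.down = nn.Linear(hidden_size, bottleneck_size)
        self.up = nn.Linear(bottleneck_size, hidden_size)
        self.act = nn.GELU()
        # Near-identity initialization so the adapted model starts close to
        # the frozen pretrained backbone.
        nn.init.normal_(self.down.weight, std=1e-3)
        nn.init.zeros_(self.down.bias)
        nn.init.normal_(self.up.weight, std=1e-3)
        nn.init.zeros_(self.up.bias)

    def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
        return hidden_states + self.up(self.act(self.down(hidden_states)))


def freeze_backbone_train_adapters(model: nn.Module) -> None:
    """Freeze every parameter except those whose name marks them as adapter
    weights (hypothetical naming convention for this sketch)."""
    for name, param in model.named_parameters():
        param.requires_grad = "adapter" in name
```

In this setup only the adapter (and typically a task head) is updated during both adapter-based pretraining and fine-tuning, which is what keeps the trainable-parameter count at a small fraction of the full model.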
Anthology ID:
2021.repl4nlp-1.11
Volume:
Proceedings of the 6th Workshop on Representation Learning for NLP (RepL4NLP-2021)
Month:
August
Year:
2021
Address:
Online
Editors:
Anna Rogers, Iacer Calixto, Ivan Vulić, Naomi Saphra, Nora Kassner, Oana-Maria Camburu, Trapit Bansal, Vered Shwartz
Venue:
RepL4NLP
Publisher:
Association for Computational Linguistics
Pages:
90–99
URL:
https://aclanthology.org/2021.repl4nlp-1.11
DOI:
10.18653/v1/2021.repl4nlp-1.11
Cite (ACL):
Seungwon Kim, Alex Shum, Nathan Susanj, and Jonathan Hilgart. 2021. Revisiting Pretraining with Adapters. In Proceedings of the 6th Workshop on Representation Learning for NLP (RepL4NLP-2021), pages 90–99, Online. Association for Computational Linguistics.
Cite (Informal):
Revisiting Pretraining with Adapters (Kim et al., RepL4NLP 2021)
PDF:
https://aclanthology.org/2021.repl4nlp-1.11.pdf
Optional supplementary material:
2021.repl4nlp-1.11.OptionalSupplementaryMaterial.zip
Data
AG News, IMDb Movie Reviews, SciERC