Cutting Down on Prompts and Parameters: Simple Few-Shot Learning with Language Models

Robert Logan IV, Ivana Balazevic, Eric Wallace, Fabio Petroni, Sameer Singh, Sebastian Riedel


Abstract
Prompting language models (LMs) with training examples and task descriptions has been seen as critical to recent successes in few-shot learning. In this work, we show that finetuning LMs in the few-shot setting can considerably reduce the need for prompt engineering. In fact, one can use null prompts, prompts that contain neither task-specific templates nor training examples, and achieve competitive accuracy to manually-tuned prompts across a wide range of tasks. While finetuning LMs does introduce new parameters for each downstream task, we show that this memory overhead can be substantially reduced: finetuning only the bias terms can achieve comparable or better accuracy than standard finetuning while only updating 0.1% of the parameters. All in all, we recommend finetuning LMs for few-shot learning as it is more accurate, robust to different prompts, and can be made nearly as efficient as using frozen LMs.
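The two ideas the abstract describes, null prompts and bias-only finetuning, can be sketched in a few lines. The snippet below is a minimal illustration and not the authors' released code (see ucinlp/null-prompts for that): the checkpoint, the verbalizer words, and the sentiment-style task are assumptions, and the training loop over the few-shot examples is omitted.

```python
# Minimal sketch (assumptions noted) of: (1) a null prompt, i.e. the input text
# followed only by a mask token, with no task template and no in-context examples;
# (2) bias-only finetuning, which leaves all weights frozen except bias terms
# (roughly 0.1% of the parameters). Checkpoint and label words are illustrative.
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

model_name = "roberta-base"  # assumed masked-LM checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForMaskedLM.from_pretrained(model_name)

# (2) Freeze everything except bias terms; only these would be updated by a
# standard training loop over the few-shot examples (not shown here).
for name, param in model.named_parameters():
    param.requires_grad = name.endswith("bias")

# (1) Null prompt: the input followed directly by the mask token.
def null_prompt(text: str) -> str:
    return f"{text} {tokenizer.mask_token}"

# Verbalizer: label words scored at the mask position (assumed sentiment task).
label_words = {"negative": "terrible", "positive": "great"}
label_ids = {
    label: tokenizer.convert_tokens_to_ids(tokenizer.tokenize(" " + word))[0]
    for label, word in label_words.items()
}

def classify(text: str) -> str:
    inputs = tokenizer(null_prompt(text), return_tensors="pt")
    mask_pos = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero()[0].item()
    with torch.no_grad():
        logits = model(**inputs).logits[0, mask_pos]
    return max(label_ids, key=lambda label: logits[label_ids[label]].item())

print(classify("A charming and well-acted film."))
```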
Anthology ID:
2022.findings-acl.222
Volume:
Findings of the Association for Computational Linguistics: ACL 2022
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
2824–2835
URL:
https://aclanthology.org/2022.findings-acl.222
DOI:
10.18653/v1/2022.findings-acl.222
Bibkey:
Cite (ACL):
Robert Logan IV, Ivana Balazevic, Eric Wallace, Fabio Petroni, Sameer Singh, and Sebastian Riedel. 2022. Cutting Down on Prompts and Parameters: Simple Few-Shot Learning with Language Models. In Findings of the Association for Computational Linguistics: ACL 2022, pages 2824–2835, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Cutting Down on Prompts and Parameters: Simple Few-Shot Learning with Language Models (Logan IV et al., Findings 2022)
PDF:
https://aclanthology.org/2022.findings-acl.222.pdf
Software:
 2022.findings-acl.222.software.zip
Video:
 https://aclanthology.org/2022.findings-acl.222.mp4
Code:
ucinlp/null-prompts + additional community code
Data:
GLUE, QNLI