MetaICL: Learning to Learn In Context

Sewon Min, Mike Lewis, Luke Zettlemoyer, Hannaneh Hajishirzi


Abstract
We introduce MetaICL (Meta-training for In-Context Learning), a new meta-training framework for few-shot learning where a pretrained language model is tuned to do in-context learning on a large set of training tasks. This meta-training enables the model to more effectively learn a new task in context at test time, by simply conditioning on a few training examples with no parameter updates or task-specific templates. We experiment on a large, diverse collection of tasks consisting of 142 NLP datasets including classification, question answering, natural language inference, paraphrase detection and more, across seven different meta-training/target splits. MetaICL outperforms a range of baselines including in-context learning without meta-training and multi-task learning followed by zero-shot transfer. We find that the gains are particularly significant for target tasks that have domain shifts from the meta-training tasks, and that using a diverse set of the meta-training tasks is key to improvements. We also show that MetaICL approaches (and sometimes beats) the performance of models fully finetuned on the target task training data, and outperforms much bigger models with nearly 8x parameters.
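To make the meta-training setup concrete, below is a minimal Python sketch of how a MetaICL-style training instance could be assembled: k labeled examples from one task are concatenated with a new input, and the model is trained to generate the new output. This is not the authors' implementation (see the linked repository); the function name and the newline-separated input/output format are assumptions for illustration only.

```python
# Minimal sketch of building one MetaICL-style meta-training instance.
# NOT the authors' code; the exact formatting of demonstrations is an
# assumption based on the abstract's description.
import random

def build_metaicl_instance(task_examples, k=4, rng=random):
    """Sample k demonstrations plus one target example from a single task.

    Returns (context, target): the context holds k input/output pairs
    followed by a new input, and the model is trained with a standard
    language-modeling loss to produce the target output.
    """
    sampled = rng.sample(task_examples, k + 1)
    demos, (x, y) = sampled[:k], sampled[k]
    context = "\n".join(f"{xi}\n{yi}" for xi, yi in demos) + f"\n{x}\n"
    return context, y

# At test time the same format is applied to an unseen task: k labeled
# examples of that task go into the context, and the model predicts the
# output for a new input with no parameter updates.
```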
Anthology ID:
2022.naacl-main.201
Volume:
Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
Month:
July
Year:
2022
Address:
Seattle, United States
Editors:
Marine Carpuat, Marie-Catherine de Marneffe, Ivan Vladimir Meza Ruiz
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
2791–2809
URL:
https://aclanthology.org/2022.naacl-main.201
DOI:
10.18653/v1/2022.naacl-main.201
Cite (ACL):
Sewon Min, Mike Lewis, Luke Zettlemoyer, and Hannaneh Hajishirzi. 2022. MetaICL: Learning to Learn In Context. In Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pages 2791–2809, Seattle, United States. Association for Computational Linguistics.
Cite (Informal):
MetaICL: Learning to Learn In Context (Min et al., NAACL 2022)
PDF:
https://aclanthology.org/2022.naacl-main.201.pdf
Video:
https://aclanthology.org/2022.naacl-main.201.mp4
Code
facebookresearch/metaicl + additional community code
Data
Hate Speech, Natural Instructions