Unnatural Instructions: Tuning Language Models with (Almost) No Human Labor

Or Honovich, Thomas Scialom, Omer Levy, Timo Schick


Abstract
Instruction tuning enables pretrained language models to perform new tasks from inference-time natural language descriptions. These approaches rely on vast amounts of human supervision in the form of crowdsourced datasets or user interactions. In this work, we introduce Unnatural Instructions: a large dataset of creative and diverse instructions, collected with virtually no human labor. We collect 64,000 examples by prompting a language model with three seed examples of instructions and eliciting a fourth. This set is then expanded by prompting the model to rephrase each instruction, creating a total of approximately 240,000 examples of instructions, inputs, and outputs. Experiments show that despite containing a fair amount of noise, training on Unnatural Instructions rivals the effectiveness of training on open-source manually-curated datasets, surpassing the performance of models such as T0++ and Tk-Instruct across various benchmarks. These results demonstrate the potential of model-generated data as a cost-effective alternative to crowdsourcing for dataset expansion and diversification.
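The abstract outlines a two-step collection recipe: prompt a model with three seed demonstrations to elicit a fourth example, then expand the set by asking the model to rephrase each instruction. Below is a minimal Python sketch of that loop, assuming a placeholder `complete(prompt)` function in place of a real language-model API; the seed texts and prompt templates are illustrative and not taken from the paper.

```python
# Minimal sketch of the Unnatural Instructions collection loop.
# `complete` is a stub for a real language-model API call; the seed
# demonstrations and prompt templates here are illustrative only.

import random

SEED_POOL = [
    "Instruction: Translate the sentence into French.\n"
    "Input: The weather is nice today.",
    "Instruction: Summarize the paragraph in one sentence.\n"
    "Input: <paragraph>",
    "Instruction: Classify the review as positive or negative.\n"
    "Input: <review>",
]

def complete(prompt: str) -> str:
    """Placeholder standing in for a language-model completion call."""
    raise NotImplementedError("plug in an LM API here")

def elicit_example() -> str:
    """Show three seed demonstrations and ask the model for a fourth."""
    seeds = random.sample(SEED_POOL, 3)
    prompt = "\n\n".join(
        f"Example {i + 1}\n{seed}" for i, seed in enumerate(seeds)
    )
    return complete(prompt + "\n\nExample 4\n").strip()

def expand(instruction: str) -> str:
    """Rephrase an instruction to diversify the dataset (the step that
    grows roughly 64k core examples to ~240k total)."""
    prompt = f"Rephrase the following instruction:\n{instruction}\n\nRephrased:"
    return complete(prompt).strip()
```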
Anthology ID: 2023.acl-long.806
Volume: Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month: July
Year: 2023
Address: Toronto, Canada
Editors: Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue: ACL
Publisher: Association for Computational Linguistics
Pages: 14409–14428
URL: https://aclanthology.org/2023.acl-long.806
DOI: 10.18653/v1/2023.acl-long.806
Cite (ACL): Or Honovich, Thomas Scialom, Omer Levy, and Timo Schick. 2023. Unnatural Instructions: Tuning Language Models with (Almost) No Human Labor. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 14409–14428, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal): Unnatural Instructions: Tuning Language Models with (Almost) No Human Labor (Honovich et al., ACL 2023)
PDF: https://aclanthology.org/2023.acl-long.806.pdf
Video: https://aclanthology.org/2023.acl-long.806.mp4