LongForm: Effective Instruction Tuning with Reverse Instructions

Abdullatif Köksal, Timo Schick, Anna Korhonen, Hinrich Schuetze


Abstract
Instruction tuning enables language models to generalize more effectively and better follow user intent. However, obtaining instruction data is costly and challenging. Prior work employs methods such as expensive human annotation, crowd-sourced datasets with alignment issues, and generating noisy examples via LLMs. We introduce the LongForm-C dataset, created by reverse instructions: we first select a diverse set of human-written documents from corpora such as C4 and Wikipedia, then use LLMs to generate instructions for these documents. This approach yields a cheaper and cleaner instruction-tuning dataset with natural outputs, one well suited to long text generation. Our models outperform 10x larger language models without instruction tuning on tasks such as story/recipe generation and long-form question answering. Moreover, LongForm models outperform prior instruction-tuned models such as FLAN-T5 and Alpaca by a large margin, and further improve language understanding capabilities. We publicly release our data and models: [Anonymized-URL].
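The reverse-instructions pipeline described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: `reverse_instruction_prompt` is a hypothetical prompt template, and the `llm` callable is a stand-in for any LLM completion function.

```python
def reverse_instruction_prompt(document: str) -> str:
    """Build a prompt asking an LLM to invent the instruction that the
    given human-written document would be a good answer to (the
    'reverse' direction). The template wording here is an assumption."""
    return (
        f"Output: {document}\n"
        "What kind of instruction could this output be the answer to?\n"
        "Instruction:"
    )

def build_pair(document: str, llm=None) -> dict:
    """Turn one corpus document into an (instruction, output) pair.

    `llm` is any callable str -> str; it is stubbed out here so the
    sketch runs without an API. The key point: the human-written
    document itself becomes the target output, so outputs stay natural
    and clean, while only the instruction is machine-generated."""
    prompt = reverse_instruction_prompt(document)
    instruction = llm(prompt) if llm else "<instruction from LLM>"
    return {"instruction": instruction, "output": document}

# Example: a recipe-like document from a web corpus becomes the output
# of a generated instruction.
pair = build_pair("Preheat the oven to 180C. Mix flour, sugar, and eggs.")
```

Because the expensive LLM call produces only the short instruction, not the long output, this direction is cheaper than generating full examples with an LLM and avoids noisy machine-written targets.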
Anthology ID:
2024.findings-emnlp.414
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2024
Month:
November
Year:
2024
Address:
Miami, Florida, USA
Editors:
Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
7056–7078
URL:
https://aclanthology.org/2024.findings-emnlp.414
DOI:
10.18653/v1/2024.findings-emnlp.414
Bibkey:
Cite (ACL):
Abdullatif Köksal, Timo Schick, Anna Korhonen, and Hinrich Schuetze. 2024. LongForm: Effective Instruction Tuning with Reverse Instructions. In Findings of the Association for Computational Linguistics: EMNLP 2024, pages 7056–7078, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal):
LongForm: Effective Instruction Tuning with Reverse Instructions (Köksal et al., Findings 2024)
PDF:
https://aclanthology.org/2024.findings-emnlp.414.pdf
Software:
 2024.findings-emnlp.414.software.zip
Data:
 2024.findings-emnlp.414.data.zip