Induct-Learn: Short Phrase Prompting with Instruction Induction

Po-Chun Chen, Sheng-Lun Wei, Hen-Hsen Huang, Hsin-Hsi Chen


Abstract
Large Language Models (LLMs) have demonstrated the capability of “instruction induction”: generating instructions from demonstrations (input-output pairs). However, existing methods often rely on large datasets or numerous examples, which is impractical and costly in real-world scenarios. In this work, we propose a low-cost, task-level framework called Induct-Learn. It induces pseudo instructions from a few demonstrations and a short phrase, and adds a chain-of-thought (CoT) process to the existing demonstrations. When encountering a new problem, the induced pseudo instructions and the demonstrations augmented with the pseudo CoT process can be combined into a prompt that guides the LLM’s problem-solving process. We validate our approach on the BBH-Induct and Evals-Induct datasets, and the results show that the Induct-Learn framework outperforms state-of-the-art methods. We also demonstrate cross-model adaptability and achieve superior performance at a lower cost than existing methods.
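
The abstract outlines a pipeline of inducing an instruction from a few demonstrations plus a short phrase, augmenting the demonstrations with a pseudo CoT process, and combining both into a prompt at inference time. The following is a minimal sketch of how such a pipeline might be wired up; call_llm and all prompt wordings are hypothetical stand-ins for illustration, not the paper's actual templates.

# Illustrative sketch of the Induct-Learn flow described in the abstract.
# call_llm is a hypothetical placeholder for any chat-completion API.

def call_llm(prompt: str) -> str:
    """Hypothetical LLM call; replace with your provider's API."""
    raise NotImplementedError

def induce_instruction(demos, short_phrase):
    """Induce a pseudo instruction from a few demonstrations and a short phrase."""
    demo_text = "\n".join(f"Input: {x}\nOutput: {y}" for x, y in demos)
    prompt = (
        f"Task hint: {short_phrase}\n"
        f"{demo_text}\n"
        "Write an instruction that maps each input to its output."
    )
    return call_llm(prompt)

def add_pseudo_cot(demos, instruction):
    """Augment each demonstration with a pseudo CoT rationale."""
    augmented = []
    for x, y in demos:
        prompt = (
            f"Instruction: {instruction}\n"
            f"Input: {x}\nOutput: {y}\n"
            "Explain step by step how the output follows from the input."
        )
        augmented.append((x, call_llm(prompt), y))
    return augmented

def solve(new_input, instruction, cot_demos):
    """Combine the pseudo instruction and CoT-augmented demonstrations
    into a single prompt for a new problem."""
    demo_text = "\n\n".join(
        f"Input: {x}\nReasoning: {cot}\nOutput: {y}"
        for x, cot, y in cot_demos
    )
    prompt = (
        f"Instruction: {instruction}\n\n{demo_text}\n\n"
        f"Input: {new_input}\nReasoning:"
    )
    return call_llm(prompt)

Under these assumptions, induction and CoT augmentation are paid once per task, and only solve runs per new problem, which is what makes the framework low-cost at the task level.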
Anthology ID:
2024.emnlp-main.297
Volume:
Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2024
Address:
Miami, Florida, USA
Editors:
Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
5204–5231
URL:
https://aclanthology.org/2024.emnlp-main.297
Cite (ACL):
Po-Chun Chen, Sheng-Lun Wei, Hen-Hsen Huang, and Hsin-Hsi Chen. 2024. Induct-Learn: Short Phrase Prompting with Instruction Induction. In Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing, pages 5204–5231, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal):
Induct-Learn: Short Phrase Prompting with Instruction Induction (Chen et al., EMNLP 2024)
PDF:
https://aclanthology.org/2024.emnlp-main.297.pdf
Data:
2024.emnlp-main.297.data.zip