Automatic Engineering of Long Prompts

Cho-Jui Hsieh, Si Si, Felix Yu, Inderjit Dhillon


Abstract
Large language models (LLMs) have demonstrated remarkable capabilities in solving complex open-domain tasks, guided by comprehensive instructions and demonstrations provided in the form of prompts. However, these prompts can be lengthy, often comprising hundreds of lines and thousands of tokens, and their design often requires considerable human effort. Recent research has explored automatic prompt engineering for short prompts, typically consisting of one or a few sentences. However, the automatic design of long prompts remains a challenging problem due to its immense search space. In this paper, we propose Automated Prompt Engineering Xpert (APEX), a novel algorithm that automatically improves long prompts. APEX combines a greedy algorithm with beam search for efficiency, and it uses search history to significantly enhance the effectiveness of LLM-based mutation during the search. Our results show that APEX achieves an average accuracy gain of 9.2% on eight tasks in Big Bench Hard and consistent improvements on GSM8K across various models, highlighting the importance of automating prompt design to fully harness the capabilities of LLMs.
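The search procedure the abstract describes (greedy sentence-level edits, a beam of candidate prompts, and an LLM rewriter informed by search history) can be summarized in a short sketch. The Python below is illustrative only: score_fn, mutate_fn, and all parameter names are assumptions rather than the paper's actual interface, and the history-conditioned LLM mutation is reduced to a placeholder callback.

import random

def beam_search_prompt_edit(prompt_sentences, score_fn, mutate_fn,
                            beam_width=3, n_iters=10, pool_size=4):
    """Greedy beam search over sentence-level prompt edits (illustrative)."""
    # Each beam entry is (score, sentences); score_fn is assumed to
    # evaluate the full prompt on a small set of training examples.
    beam = [(score_fn(prompt_sentences), list(prompt_sentences))]
    history = []  # past edits: (old sentence, new sentence, score change)

    for _ in range(n_iters):
        candidates = list(beam)  # keep current prompts so quality never drops
        for score, sentences in beam:
            # Greedily pick one sentence of the long prompt to rewrite.
            idx = random.randrange(len(sentences))
            for _ in range(pool_size):
                # mutate_fn stands in for the LLM rewriter; in the paper's
                # setting it would be conditioned on the search history.
                new_sentence = mutate_fn(sentences[idx], history)
                new_prompt = sentences[:idx] + [new_sentence] + sentences[idx + 1:]
                new_score = score_fn(new_prompt)
                history.append((sentences[idx], new_sentence, new_score - score))
                candidates.append((new_score, new_prompt))
        # Keep only the top beam_width prompts for the next iteration.
        candidates.sort(key=lambda c: c[0], reverse=True)
        beam = candidates[:beam_width]

    return beam[0]  # best (score, sentences) found

Under this reading, score_fn would correspond to task accuracy on held-out training examples and mutate_fn to an LLM that rewrites one sentence at a time; how APEX actually selects which sentence to edit and how it exploits the history is specified in the paper itself.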
Anthology ID:
2024.findings-acl.634
Volume:
Findings of the Association for Computational Linguistics ACL 2024
Month:
August
Year:
2024
Address:
Bangkok, Thailand and virtual meeting
Editors:
Lun-Wei Ku, Andre Martins, Vivek Srikumar
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
10672–10685
URL:
https://aclanthology.org/2024.findings-acl.634
Cite (ACL):
Cho-Jui Hsieh, Si Si, Felix Yu, and Inderjit Dhillon. 2024. Automatic Engineering of Long Prompts. In Findings of the Association for Computational Linguistics ACL 2024, pages 10672–10685, Bangkok, Thailand and virtual meeting. Association for Computational Linguistics.
Cite (Informal):
Automatic Engineering of Long Prompts (Hsieh et al., Findings 2024)
PDF:
https://aclanthology.org/2024.findings-acl.634.pdf