Survival of the Most Influential Prompts: Efficient Black-Box Prompt Search via Clustering and Pruning

Han Zhou, Xingchen Wan, Ivan Vulić, Anna Korhonen


Abstract
Prompt-based learning has been an effective paradigm for large pretrained language models (LLMs), enabling few-shot or even zero-shot learning. Black-box prompt search has received growing interest recently for its distinctive property of gradient-free optimization, proven particularly useful and powerful for model-as-a-service usage. However, the discrete nature and the complexity of combinatorial optimization hinder the efficiency of modern black-box approaches. Despite extensive research on search algorithms, the crucial aspect of search space design and optimization has been largely overlooked. In this paper, we first conduct a sensitivity analysis by prompting LLMs, revealing that only a small number of tokens exert a disproportionate amount of influence on LLM predictions. Leveraging this insight, we propose Clustering and Pruning for Efficient Black-box Prompt Search (ClaPS), a simple black-box search method that first clusters and prunes the search space to focus exclusively on influential prompt tokens. By employing even simple search methods within the pruned search space, ClaPS achieves state-of-the-art performance across various tasks and LLMs, surpassing the performance of complex approaches while significantly reducing search costs. Our findings underscore the critical role of search space design and optimization in enhancing both the usefulness and the efficiency of black-box prompt-based learning.
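The cluster-then-prune-then-search pipeline described in the abstract can be illustrated with a toy sketch. Everything below is a hypothetical stand-in, not the authors' implementation: the vocabulary, the 1-D "embeddings", the greedy clustering, and the `score` function (which substitutes for an actual black-box LLM objective) are all invented for illustration; real ClaPS probes LLM predictions and clusters real token representations.

```python
import itertools

# Toy candidate vocabulary with made-up 1-D "embeddings" (assumption:
# real systems would use the LLM's own token embeddings).
VOCAB = {
    "the": 0.10, "a": 0.15, "of": 0.20,                   # filler-like
    "classify": 0.80, "sentiment": 0.85, "review": 0.90,  # task-like
}

# Toy black-box objective standing in for the LLM's task score: it
# rewards prompts that contain task-relevant tokens.
INFLUENTIAL = {"classify", "sentiment", "review"}
def score(prompt):
    return sum(1.0 for tok in prompt if tok in INFLUENTIAL)

# Step 1 (clustering): greedily group tokens whose embeddings lie close
# together, so similar tokens can be probed and pruned as one unit.
def cluster(vocab, radius=0.2):
    clusters = []
    for tok, emb in sorted(vocab.items(), key=lambda kv: kv[1]):
        if clusters and emb - clusters[-1][-1][1] <= radius:
            clusters[-1].append((tok, emb))
        else:
            clusters.append([(tok, emb)])
    return [[tok for tok, _ in c] for c in clusters]

# Step 2 (pruning): probe one representative per cluster and drop whole
# clusters whose representative does not move the objective; pruning
# cost therefore scales with the number of clusters, not of tokens.
def prune(clusters):
    kept = []
    for c in clusters:
        representative = c[0]
        if score(["the", representative]) > score(["the"]):
            kept.extend(c)
    return kept

# Step 3 (search): the pruned space is small enough that even an
# exhaustive search over short prompts becomes cheap.
pruned = prune(cluster(VOCAB))
best = max(itertools.permutations(pruned, 2), key=score)
print(sorted(pruned))  # only the influential cluster survives pruning
```

The design point the sketch tries to capture is the one the abstract argues for: once low-influence regions of the space are pruned away, a simple (here, brute-force) search suffices, so the savings come from search-space design rather than from a sophisticated search algorithm.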
Anthology ID:
2023.findings-emnlp.870
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2023
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
13064–13077
URL:
https://aclanthology.org/2023.findings-emnlp.870
DOI:
10.18653/v1/2023.findings-emnlp.870
Cite (ACL):
Han Zhou, Xingchen Wan, Ivan Vulić, and Anna Korhonen. 2023. Survival of the Most Influential Prompts: Efficient Black-Box Prompt Search via Clustering and Pruning. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 13064–13077, Singapore. Association for Computational Linguistics.
Cite (Informal):
Survival of the Most Influential Prompts: Efficient Black-Box Prompt Search via Clustering and Pruning (Zhou et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-emnlp.870.pdf