Prompterator: Iterate Efficiently towards More Effective Prompts

Samuel Sučik, Daniel Skala, Andrej Švec, Peter Hraška, Marek Šuppa


Abstract
With the advent of Large Language Models (LLMs), the process known as prompting, which entices an LLM to solve an arbitrary language processing task without the need for finetuning, has risen to prominence. Finding well-performing prompts, however, is a non-trivial task that requires experimentation in order to arrive at a prompt that solves a specific task. When a given task does not readily reduce to one that can be measured with well-established metrics, human evaluation of the results obtained by prompting is often necessary. In this work we present prompterator, a tool that helps the user interactively iterate over candidate prompts and choose the best-performing one based on human feedback. It is distributed as an open-source package with out-of-the-box support for various LLM providers and was designed to be easily extensible.
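The workflow the abstract describes, generating outputs for several candidate prompts and letting human feedback pick the winner, can be sketched as follows. This is a minimal illustrative sketch only; all function names here are hypothetical and do not reflect prompterator's actual API, and the "LLM" and "rater" are stand-in callables.

```python
# Illustrative sketch of prompt iteration with human feedback.
# `generate` stands in for an LLM provider call; `rater` stands in for
# a human annotator marking each output as acceptable (True) or not (False).

def best_prompt(candidates, generate, rater):
    """Return the candidate prompt whose outputs get the highest approval rate."""
    scores = {}
    for prompt in candidates:
        outputs = generate(prompt)                    # collect model outputs
        votes = [rater(prompt, out) for out in outputs]  # per-output feedback
        scores[prompt] = sum(votes) / len(votes)      # fraction approved
    return max(scores, key=scores.get)

# Toy usage: a fake "LLM" that echoes the prompt in two casings,
# and a "rater" that happens to prefer outputs of short prompts.
gen = lambda p: [p.upper(), p.lower()]
rate = lambda p, o: len(p) < 20
print(best_prompt(
    ["Summarise: {text}", "Please provide a comprehensive summary of {text}"],
    gen, rate,
))  # prints "Summarise: {text}"
```

In practice the rating step is the interactive part: the tool shows each prompt's outputs side by side and records the annotator's judgments, rather than computing them from a function.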
Anthology ID: 2023.emnlp-demo.43
Volume: Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing: System Demonstrations
Month: December
Year: 2023
Address: Singapore
Editors: Yansong Feng, Els Lefever
Venue: EMNLP
Publisher: Association for Computational Linguistics
Pages: 471–478
URL: https://aclanthology.org/2023.emnlp-demo.43
DOI: 10.18653/v1/2023.emnlp-demo.43
Cite (ACL): Samuel Sučik, Daniel Skala, Andrej Švec, Peter Hraška, and Marek Šuppa. 2023. Prompterator: Iterate Efficiently towards More Effective Prompts. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing: System Demonstrations, pages 471–478, Singapore. Association for Computational Linguistics.
Cite (Informal): Prompterator: Iterate Efficiently towards More Effective Prompts (Sučik et al., EMNLP 2023)
PDF: https://aclanthology.org/2023.emnlp-demo.43.pdf