Improving Minimum Bayes Risk Decoding with Multi-Prompt

David Heineman, Yao Dou, Wei Xu


Abstract
While instruction fine-tuned LLMs are effective text generators, their sensitivity to prompt construction makes performance unstable and sub-optimal in practice. Relying on a single ‘best’ prompt cannot capture all differing approaches to a generation problem. Motivated by this observation, we propose multi-prompt decoding, in which many candidate generations are decoded from a prompt bank at inference time. To ensemble the candidates, we use Minimum Bayes Risk (MBR) decoding, which selects a final output using a trained value metric. We show that multi-prompt improves MBR across a comprehensive set of conditional generation tasks, and that this gain results from estimating a more diverse and higher-quality candidate space than that of a single prompt. Our experiments confirm that multi-prompt improves generation across tasks, models, and metrics.
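The selection step the abstract describes can be sketched as Monte Carlo MBR over a candidate pool: each candidate is scored by its expected utility against all other candidates, and the highest-scoring one is returned. The sketch below is illustrative, not the paper's implementation; it assumes the candidates were already decoded from several distinct prompts, and it substitutes a simple token-overlap F1 for the trained value metric the paper uses.

```python
from collections import Counter


def utility(hyp: str, ref: str) -> float:
    """Toy token-overlap F1. A stand-in assumption for the trained
    value metric used in the paper (e.g. a learned quality metric)."""
    h, r = Counter(hyp.split()), Counter(ref.split())
    overlap = sum((h & r).values())  # multiset intersection of tokens
    if overlap == 0:
        return 0.0
    precision = overlap / sum(h.values())
    recall = overlap / sum(r.values())
    return 2 * precision * recall / (precision + recall)


def mbr_select(candidates: list[str]) -> str:
    """Return the candidate maximizing expected utility against the
    candidate pool itself (Monte Carlo MBR decoding)."""
    return max(
        candidates,
        key=lambda c: sum(utility(c, other) for other in candidates),
    )


# Candidates as if decoded from a bank of different prompts:
candidates = [
    "the cat sat on the mat",
    "a cat is sitting on the mat",
    "the cat sat on a mat",
    "dogs bark loudly",
]
print(mbr_select(candidates))  # → "the cat sat on a mat"
```

The outlier candidate ("dogs bark loudly") scores low against every other candidate and is rejected, which illustrates why a more diverse, higher-quality candidate pool, such as one decoded from multiple prompts, gives MBR a better risk estimate to select from.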
Anthology ID: 2024.emnlp-main.1255
Volume: Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing
Month: November
Year: 2024
Address: Miami, Florida, USA
Editors: Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
Venue: EMNLP
Publisher: Association for Computational Linguistics
Pages: 22525–22545
URL: https://aclanthology.org/2024.emnlp-main.1255/
DOI: 10.18653/v1/2024.emnlp-main.1255
Cite (ACL): David Heineman, Yao Dou, and Wei Xu. 2024. Improving Minimum Bayes Risk Decoding with Multi-Prompt. In Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing, pages 22525–22545, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal): Improving Minimum Bayes Risk Decoding with Multi-Prompt (Heineman et al., EMNLP 2024)
PDF: https://aclanthology.org/2024.emnlp-main.1255.pdf