Follow the Wisdom of the Crowd: Effective Text Generation via Minimum Bayes Risk Decoding

Mirac Suzgun, Luke Melas-Kyriazi, Dan Jurafsky


Abstract
In open-ended natural-language generation, existing text decoding methods typically struggle to produce text that is both diverse and high-quality. Greedy and beam search are known to suffer from text degeneration and poor linguistic diversity, while temperature, top-k, and nucleus sampling yield diverse but often lower-quality outputs. In this work, we build upon Minimum Bayes Risk Decoding (MBRD), a family of decoding methods based on Bayesian risk minimization, to address this diversity-quality trade-off. Inspired by the principle of the wisdom of the crowd, MBRD selects, from a pool of candidates, the one with the least expected risk under a generative model according to a given utility function. The crowd of candidates serves as an approximation of the distribution over human-generated references. We show that MBRD generalizes numerous decoding methods, including majority voting, and can be used as a drop-in replacement for existing sampling methods. Across a wide range of tasks, including summarization, data-to-text, translation, and textual style transfer, MBRD yields 3-7 ROUGE and BLEU point improvements, including state-of-the-art results on WebNLG and WMT'16.
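
Below is a minimal sketch of the MBR selection step described in the abstract, under stated assumptions: the candidate pool is assumed to come from an existing sampling method (temperature, top-k, or nucleus), and a simple unigram-F1 overlap stands in for the BLEU/ROUGE-style utilities used in the paper. Function names and the example pool are illustrative, not the authors' implementation.

# Minimum Bayes Risk (MBR) selection over a pool of sampled candidates:
# pick the candidate with the highest average utility (i.e., lowest expected
# risk) measured against every other candidate in the pool.
from collections import Counter
from typing import Callable, List


def unigram_f1(hypothesis: str, reference: str) -> float:
    """Token-level F1 overlap; an illustrative stand-in for a BLEU/ROUGE utility."""
    hyp, ref = Counter(hypothesis.split()), Counter(reference.split())
    overlap = sum((hyp & ref).values())
    if overlap == 0:
        return 0.0
    precision = overlap / sum(hyp.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)


def mbr_decode(candidates: List[str],
               utility: Callable[[str, str], float] = unigram_f1) -> str:
    """Return the candidate with the highest average utility against the rest of the pool."""
    def expected_utility(i: int) -> float:
        scores = [utility(candidates[i], candidates[j])
                  for j in range(len(candidates)) if j != i]
        return sum(scores) / max(len(scores), 1)

    best_index = max(range(len(candidates)), key=expected_utility)
    return candidates[best_index]


if __name__ == "__main__":
    # In practice, the pool would be N samples drawn from a generation model.
    pool = [
        "the cat sat on the mat",
        "a cat sat on the mat",
        "the cat is sitting on a mat",
        "dogs bark loudly at night",  # outlier that the "crowd" votes down
    ]
    print(mbr_decode(pool))

Running this example selects one of the mutually similar "cat ... mat" candidates, since the outlier receives low average utility from the rest of the pool; this is the wisdom-of-the-crowd effect the abstract refers to.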
Anthology ID: 2023.findings-acl.262
Volume: Findings of the Association for Computational Linguistics: ACL 2023
Month: July
Year: 2023
Address: Toronto, Canada
Editors: Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 4265–4293
URL: https://aclanthology.org/2023.findings-acl.262
DOI: 10.18653/v1/2023.findings-acl.262
Cite (ACL): Mirac Suzgun, Luke Melas-Kyriazi, and Dan Jurafsky. 2023. Follow the Wisdom of the Crowd: Effective Text Generation via Minimum Bayes Risk Decoding. In Findings of the Association for Computational Linguistics: ACL 2023, pages 4265–4293, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal): Follow the Wisdom of the Crowd: Effective Text Generation via Minimum Bayes Risk Decoding (Suzgun et al., Findings 2023)
PDF: https://aclanthology.org/2023.findings-acl.262.pdf