Faster Minimum Bayes Risk Decoding with Confidence-based Pruning

Julius Cheng, Andreas Vlachos


Abstract
Minimum Bayes risk (MBR) decoding outputs the hypothesis with the highest expected utility over the model distribution for some utility function. It has been shown to improve accuracy over beam search in conditional language generation problems and especially neural machine translation, in both human and automatic evaluations. However, the standard sampling-based algorithm for MBR is substantially more computationally expensive than beam search, requiring a large number of samples as well as a quadratic number of calls to the utility function, limiting its applicability. We describe an algorithm for MBR which gradually grows the number of samples used to estimate the utility while pruning hypotheses that are unlikely to have the highest utility according to confidence estimates obtained with bootstrap sampling. Our method requires fewer samples and drastically reduces the number of calls to the utility function compared to standard MBR while being statistically indistinguishable in terms of accuracy. We demonstrate the effectiveness of our approach in experiments on three language pairs, using chrF++ and COMET as utility/evaluation metrics.
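The abstract's core idea — grow the sample pool in stages and use bootstrap resampling to prune hypotheses unlikely to be the utility maximizer — can be sketched as below. This is a minimal illustration, not the authors' implementation: the unigram-F1 `utility` is a toy stand-in for chrF++/COMET, and the schedule, bootstrap count, and `alpha` threshold are illustrative choices.

```python
import random

def utility(hyp, ref):
    """Toy utility: unigram F1 overlap (stand-in for chrF++/COMET)."""
    h, r = set(hyp.split()), set(ref.split())
    if not h or not r:
        return 0.0
    p = len(h & r) / len(h)
    rec = len(h & r) / len(r)
    return 2 * p * rec / (p + rec) if p + rec else 0.0

def mbr_with_pruning(candidates, samples, schedule=(4, 8, 16),
                     n_boot=200, alpha=0.1, seed=0):
    """Sketch of confidence-based pruning for MBR decoding.

    At each stage, score every surviving candidate against a growing
    prefix of the model samples, then bootstrap-resample those
    per-sample utilities to estimate how often each candidate ranks
    first; candidates that win fewer than an `alpha` fraction of
    bootstrap rounds are pruned, so later (larger) stages call the
    utility function on far fewer hypotheses.
    """
    rng = random.Random(seed)
    alive = list(candidates)
    for n in schedule:
        refs = samples[:n]
        # Per-candidate vector of utilities against each pseudo-reference.
        utils = {c: [utility(c, r) for r in refs] for c in alive}
        wins = {c: 0 for c in alive}
        for _ in range(n_boot):
            idx = [rng.randrange(len(refs)) for _ in range(len(refs))]
            best = max(alive, key=lambda c: sum(utils[c][i] for i in idx))
            wins[best] += 1
        survivors = [c for c in alive if wins[c] / n_boot >= alpha]
        # Never prune everything: keep at least the most frequent winner.
        alive = survivors or [max(alive, key=lambda c: wins[c])]
        if len(alive) == 1:
            break
    # Final MBR decision over the survivors using all available samples.
    return max(alive, key=lambda c: sum(utility(c, r) for r in samples))
```

Because pruning happens before the sample pool reaches full size, most candidates are only ever scored against the small early prefixes, which is where the savings in utility-function calls come from.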
Anthology ID:
2023.emnlp-main.767
Volume:
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
12473–12480
URL:
https://aclanthology.org/2023.emnlp-main.767
DOI:
10.18653/v1/2023.emnlp-main.767
Cite (ACL):
Julius Cheng and Andreas Vlachos. 2023. Faster Minimum Bayes Risk Decoding with Confidence-based Pruning. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 12473–12480, Singapore. Association for Computational Linguistics.
Cite (Informal):
Faster Minimum Bayes Risk Decoding with Confidence-based Pruning (Cheng & Vlachos, EMNLP 2023)
PDF:
https://aclanthology.org/2023.emnlp-main.767.pdf
Video:
https://aclanthology.org/2023.emnlp-main.767.mp4