Reward-Augmented Decoding: Efficient Controlled Text Generation With a Unidirectional Reward Model

Haikang Deng, Colin Raffel


Abstract
While large language models have proven effective in a huge range of downstream applications, they often generate text that is problematic or lacks a desired attribute. In this paper, we introduce Reward-Augmented Decoding (RAD), a text generation procedure that uses a small unidirectional reward model to encourage a language model to generate text that has certain properties. Specifically, RAD uses the reward model to score generations as they are produced and rescales sampling probabilities to favor high-reward tokens. By using a unidirectional reward model, RAD can cache activations from prior generation steps to decrease computational overhead. Through experiments on generating non-toxic and sentiment-controlled text, we demonstrate that RAD performs best among methods that change only the generation procedure and matches the performance of state-of-the-art methods that involve re-training the language model. We further validate that RAD is effective on very large language models while incurring a minimal computational overhead.
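The abstract describes the core decoding loop at a high level: score candidate continuations with a small unidirectional reward model and rescale the sampling distribution toward high-reward tokens. Below is a minimal, illustrative sketch in that spirit; it is not the authors' implementation, and the callables `lm_logits_fn` and `reward_fn`, the top-k restriction, and the weight `beta` are placeholder assumptions. The paper gives the exact rescaling rule and shows how the unidirectional reward model's activations can be cached across steps.

```python
import torch

def rad_decode(lm_logits_fn, reward_fn, prompt_ids,
               max_new_tokens=20, top_k=20, beta=50.0):
    """Hypothetical reward-augmented sampling loop.

    lm_logits_fn(ids) -> next-token logits, shape (vocab_size,)
    reward_fn(ids)    -> scalar reward for the candidate sequence
    """
    ids = list(prompt_ids)
    for _ in range(max_new_tokens):
        logits = lm_logits_fn(ids)
        topk = torch.topk(logits, top_k)
        # Score each candidate continuation with the reward model.
        # In the paper's setup, a unidirectional reward model lets prior
        # activations be cached so only the new token is processed here.
        rewards = torch.tensor(
            [reward_fn(ids + [tok.item()]) for tok in topk.indices]
        )
        # Shift probability mass toward high-reward candidates.
        adjusted = topk.values + beta * rewards
        probs = torch.softmax(adjusted, dim=-1)
        next_id = topk.indices[torch.multinomial(probs, 1)].item()
        ids.append(next_id)
    return ids
```

The restriction to the top-k tokens keeps the number of reward-model calls per step small, which (together with activation caching) is what the abstract refers to as incurring minimal computational overhead.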
Anthology ID:
2023.emnlp-main.721
Volume:
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
11781–11791
URL:
https://aclanthology.org/2023.emnlp-main.721
DOI:
10.18653/v1/2023.emnlp-main.721
Cite (ACL):
Haikang Deng and Colin Raffel. 2023. Reward-Augmented Decoding: Efficient Controlled Text Generation With a Unidirectional Reward Model. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 11781–11791, Singapore. Association for Computational Linguistics.
Cite (Informal):
Reward-Augmented Decoding: Efficient Controlled Text Generation With a Unidirectional Reward Model (Deng & Raffel, EMNLP 2023)
PDF:
https://aclanthology.org/2023.emnlp-main.721.pdf
Video:
https://aclanthology.org/2023.emnlp-main.721.mp4