FUDGE: Controlled Text Generation With Future Discriminators

Kevin Yang, Dan Klein


Abstract
We propose Future Discriminators for Generation (FUDGE), a flexible and modular method for controlled text generation. Given a pre-existing model G for generating text from a distribution of interest, FUDGE enables conditioning on a desired attribute a (for example, formality) while requiring access only to G’s output logits. FUDGE learns an attribute predictor operating on a partial sequence, and uses this predictor’s outputs to adjust G’s original probabilities. We show that FUDGE models terms corresponding to a Bayesian decomposition of the conditional distribution of G given attribute a. Moreover, FUDGE can easily compose predictors for multiple desired attributes. We evaluate FUDGE on three tasks — couplet completion in poetry, topic control in language generation, and formality change in machine translation — and observe gains in all three tasks.
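The decoding rule the abstract describes, reweighting G's next-token probabilities by an attribute predictor evaluated on each candidate partial sequence, follows the factorization P(x_i | x_{1:i-1}, a) ∝ P(x_i | x_{1:i-1}) · P(a | x_{1:i}). A minimal toy sketch of one decoding step is below; it is illustrative only, not the authors' released implementation, and the token names and probabilities are invented.

```python
import math

def fudge_step(lm_logprobs, attr_logprobs):
    """One FUDGE-style decoding step.

    lm_logprobs:   dict token -> log P(token | prefix) from the base model G
    attr_logprobs: dict token -> log P(a | prefix + token) from the
                   future-attribute discriminator
    Returns a dict token -> renormalized log P(token | prefix, a).
    """
    # Add the two log-probabilities per the Bayesian factorization.
    scores = {t: lm_logprobs[t] + attr_logprobs[t] for t in lm_logprobs}
    # Renormalize over the candidate set via log-sum-exp.
    m = max(scores.values())
    log_z = m + math.log(sum(math.exp(s - m) for s in scores.values()))
    return {t: s - log_z for t, s in scores.items()}

# Toy example: the base model slightly prefers "hi", but a formality
# discriminator strongly prefers "greetings", shifting the distribution.
posterior = fudge_step(
    {"hi": math.log(0.6), "greetings": math.log(0.4)},
    {"hi": math.log(0.1), "greetings": math.log(0.9)},
)
```

Composing multiple desired attributes, as the abstract notes, amounts to adding each attribute predictor's log-probability into the same per-token sum before renormalizing.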
Anthology ID:
2021.naacl-main.276
Volume:
Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
Month:
June
Year:
2021
Address:
Online
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
3511–3535
URL:
https://aclanthology.org/2021.naacl-main.276
DOI:
10.18653/v1/2021.naacl-main.276
Cite (ACL):
Kevin Yang and Dan Klein. 2021. FUDGE: Controlled Text Generation With Future Discriminators. In Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pages 3511–3535, Online. Association for Computational Linguistics.
Cite (Informal):
FUDGE: Controlled Text Generation With Future Discriminators (Yang & Klein, NAACL 2021)
PDF:
https://aclanthology.org/2021.naacl-main.276.pdf
Code
yangkevin2/naacl-2021-fudge-controlled-generation (+ additional community code)
Data
GYAFC