Controlled Text Generation with Hidden Representation Transformations

Vaibhav Kumar, Hana Koorehdavoudi, Masud Moshtaghi, Amita Misra, Ankit Chadha, Emilio Ferrara


Abstract
We propose CHRT (Control Hidden Representation Transformation), a controlled language generation framework that steers large language models to generate text pertaining to certain attributes (such as toxicity). CHRT gains attribute control by modifying the hidden representation of the base model through learned transformations. We employ a contrastive-learning framework to learn these transformations, which can be combined to gain multi-attribute control. The effectiveness of CHRT is experimentally shown by comparing it with seven baselines over three attributes. CHRT outperforms all the baselines in the tasks of detoxification, positive sentiment steering, and text simplification while minimizing the loss in linguistic qualities. Further, our approach has the lowest inference latency of only 0.01 seconds more than the base model, making it the most suitable for high-performance production environments. We open-source our code and release two novel datasets to further propel controlled language generation research.
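The abstract describes steering a base language model by passing its hidden states through learned transformation blocks that can be composed for multi-attribute control. The sketch below is only an illustration of that idea, not the authors' released implementation: the module shape, the residual update, and the averaging scheme for combining two attribute transformations are assumptions.

```python
# Illustrative sketch (assumptions, not the paper's code): a learned
# transformation applied to a base LM's hidden states.
import torch
import torch.nn as nn

class HiddenTransform(nn.Module):
    """Maps base-model hidden states toward an attribute (e.g., low toxicity)."""
    def __init__(self, hidden_size: int):
        super().__init__()
        self.proj = nn.Sequential(
            nn.Linear(hidden_size, hidden_size),
            nn.GELU(),
            nn.Linear(hidden_size, hidden_size),
        )

    def forward(self, hidden: torch.Tensor) -> torch.Tensor:
        # Residual update keeps the steered states close to the original
        # representation, which helps preserve fluency.
        return hidden + self.proj(hidden)

# Combining two attribute transformations (e.g., detoxification + sentiment)
# by averaging their outputs -- one simple way to obtain multi-attribute control.
hidden_size = 768                        # assumed (GPT-2 small)
h = torch.randn(1, 10, hidden_size)      # stand-in for base-model hidden states
detox = HiddenTransform(hidden_size)
sentiment = HiddenTransform(hidden_size)
h_steered = 0.5 * detox(h) + 0.5 * sentiment(h)
# h_steered would then be fed to the LM head to produce next-token logits.
```

In this reading, the transformations are trained (per the abstract, with a contrastive objective) while the base model stays frozen, which is consistent with the small reported latency overhead over the base model.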
Anthology ID:
2023.findings-acl.602
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
9440–9455
URL:
https://aclanthology.org/2023.findings-acl.602
DOI:
10.18653/v1/2023.findings-acl.602
Bibkey:
Cite (ACL):
Vaibhav Kumar, Hana Koorehdavoudi, Masud Moshtaghi, Amita Misra, Ankit Chadha, and Emilio Ferrara. 2023. Controlled Text Generation with Hidden Representation Transformations. In Findings of the Association for Computational Linguistics: ACL 2023, pages 9440–9455, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Controlled Text Generation with Hidden Representation Transformations (Kumar et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-acl.602.pdf
Video:
https://aclanthology.org/2023.findings-acl.602.mp4