Attribute Alignment: Controlling Text Generation from Pre-trained Language Models

Dian Yu, Zhou Yu, Kenji Sagae


Abstract
Large language models benefit from training with a large amount of unlabeled text, which gives them increasingly fluent and diverse generation capabilities. However, using these models for text generation that takes into account target attributes, such as sentiment polarity or specific topics, remains a challenge. We propose a simple and flexible method for controlling text generation by aligning disentangled attribute representations. In contrast to recent efforts on training a discriminator to perturb the token level distribution for an attribute, we use the same data to learn an alignment function to guide the pre-trained, non-controlled language model to generate texts with the target attribute without changing the original language model parameters. We evaluate our method on sentiment- and topic-controlled generation, and show large performance gains over previous methods while retaining fluency and diversity.
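The abstract describes steering a frozen pre-trained language model by learning an alignment of attribute representations, rather than perturbing its token-level distribution with a discriminator. The following is a minimal, hypothetical sketch of that general idea only, not the authors' implementation: the module name AttributeAligner, the number of virtual tokens, and the attribute indices are all assumptions made for illustration. The aligner maps an attribute label to a few prefix embeddings that are prepended to the prompt, while the GPT-2 parameters remain unchanged.

```python
# Hypothetical sketch of attribute-conditioned generation with a frozen LM.
# All module and variable names here are illustrative, not the paper's code.
import torch
import torch.nn as nn
from transformers import GPT2LMHeadModel, GPT2Tokenizer

class AttributeAligner(nn.Module):
    """Maps a learned attribute embedding to a few virtual-token embeddings
    that steer a frozen language model (illustrative only)."""
    def __init__(self, num_attributes, hidden_size, num_virtual_tokens=4):
        super().__init__()
        self.attr_emb = nn.Embedding(num_attributes, hidden_size)
        self.align = nn.Sequential(
            nn.Linear(hidden_size, hidden_size),
            nn.Tanh(),
            nn.Linear(hidden_size, num_virtual_tokens * hidden_size),
        )
        self.num_virtual_tokens = num_virtual_tokens
        self.hidden_size = hidden_size

    def forward(self, attr_ids):
        # attr_ids: (batch,) -> (batch, num_virtual_tokens, hidden_size)
        e = self.attr_emb(attr_ids)
        return self.align(e).view(-1, self.num_virtual_tokens, self.hidden_size)

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
lm = GPT2LMHeadModel.from_pretrained("gpt2")
for p in lm.parameters():          # the pre-trained LM stays frozen
    p.requires_grad = False

aligner = AttributeAligner(num_attributes=2, hidden_size=lm.config.n_embd)

# Prepend the aligned attribute embeddings to the prompt's token embeddings.
prompt = tokenizer("The movie was", return_tensors="pt")
tok_emb = lm.transformer.wte(prompt["input_ids"])
attr_ids = torch.tensor([1])       # e.g. index 1 = "positive" (hypothetical label)
prefix = aligner(attr_ids)
inputs_embeds = torch.cat([prefix, tok_emb], dim=1)

# Only the aligner would be trained (with the standard LM loss on
# attribute-labeled text); the language model itself is never updated.
out = lm(inputs_embeds=inputs_embeds)
next_token_logits = out.logits[:, -1, :]
```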
Anthology ID:
2021.findings-emnlp.194
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2021
Month:
November
Year:
2021
Address:
Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
Findings
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
2251–2268
URL:
https://aclanthology.org/2021.findings-emnlp.194
DOI:
10.18653/v1/2021.findings-emnlp.194
Cite (ACL):
Dian Yu, Zhou Yu, and Kenji Sagae. 2021. Attribute Alignment: Controlling Text Generation from Pre-trained Language Models. In Findings of the Association for Computational Linguistics: EMNLP 2021, pages 2251–2268, Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Attribute Alignment: Controlling Text Generation from Pre-trained Language Models (Yu et al., Findings 2021)
PDF:
https://aclanthology.org/2021.findings-emnlp.194.pdf
Video:
https://aclanthology.org/2021.findings-emnlp.194.mp4
Code:
diandyu/attribute_alignment
Data:
AG News, IMDb Movie Reviews, SST