Focused Prefix Tuning for Controllable Text Generation

Congda Ma, Tianyu Zhao, Makoto Shing, Kei Sawada, Manabu Okumura


Abstract
In a controllable text generation dataset, there exist unannotated attributes that can provide irrelevant learning signals to models trained on it and thus degrade their performance. We propose focused prefix tuning (FPT) to mitigate this problem and to enable the control to focus on the desired attribute. Experimental results show that FPT achieves better control accuracy and text fluency than baseline models in single-attribute control tasks. In multi-attribute control tasks, FPT achieves control accuracy comparable to the state-of-the-art approach while keeping the flexibility to control new attributes without retraining existing models.
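
The abstract's central idea is steering a frozen language model toward a desired attribute with learned prefixes. Below is a minimal sketch of generic attribute-conditioned prefix tuning in PyTorch, not the paper's specific FPT objective; the base model (GPT-2), prefix length, and attribute naming are illustrative assumptions. In training, only the prefix parameters would be optimized on attribute-labeled data while the base model stays frozen.

# Minimal sketch of attribute-conditioned prefix tuning (generic prefix tuning,
# not the paper's FPT method). Model choice and hyperparameters are assumptions.
import torch
import torch.nn as nn
from transformers import GPT2LMHeadModel, GPT2Tokenizer

class AttributePrefix(nn.Module):
    """Learned prefix embeddings for one control attribute (e.g. sentiment)."""
    def __init__(self, prefix_len: int, hidden_size: int):
        super().__init__()
        self.prefix = nn.Parameter(torch.randn(prefix_len, hidden_size) * 0.02)

    def forward(self, batch_size: int) -> torch.Tensor:
        # Expand the shared prefix to the batch dimension: (B, P, H)
        return self.prefix.unsqueeze(0).expand(batch_size, -1, -1)

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.requires_grad_(False)  # base LM is frozen; only the prefix would be trained

prefix = AttributePrefix(prefix_len=10, hidden_size=model.config.n_embd)

inputs = tokenizer("The movie was", return_tensors="pt")
token_embeds = model.transformer.wte(inputs["input_ids"])       # (1, T, H)
prefix_embeds = prefix(batch_size=token_embeds.size(0))         # (1, P, H)
full_embeds = torch.cat([prefix_embeds, token_embeds], dim=1)   # prepend prefix

attn_mask = torch.ones(full_embeds.shape[:2], dtype=torch.long)
out = model(inputs_embeds=full_embeds, attention_mask=attn_mask)
print(out.logits.shape)  # (1, P + T, vocab_size)
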
Anthology ID:
2023.acl-short.96
Volume:
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
1116–1127
URL:
https://aclanthology.org/2023.acl-short.96
DOI:
10.18653/v1/2023.acl-short.96
Cite (ACL):
Congda Ma, Tianyu Zhao, Makoto Shing, Kei Sawada, and Manabu Okumura. 2023. Focused Prefix Tuning for Controllable Text Generation. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pages 1116–1127, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Focused Prefix Tuning for Controllable Text Generation (Ma et al., ACL 2023)
PDF:
https://aclanthology.org/2023.acl-short.96.pdf
Video:
https://aclanthology.org/2023.acl-short.96.mp4