Prefix-Tuning Based Unsupervised Text Style Transfer

Huiyu Mai, Wenhao Jiang, Zhi-Hong Deng


Abstract
Unsupervised text style transfer aims to train a generative model that alters the style of an input sentence while preserving its content, without using any parallel data. In this paper, we employ powerful pre-trained large language models and present a new prefix-tuning-based method for unsupervised text style transfer. We construct three kinds of prefixes, i.e., a shared prefix, a style prefix, and a content prefix, to encode task-specific information, the target style, and the content of the input sentence, respectively. Compared with the embeddings used in previous works, the proposed prefixes provide richer information to the model. Furthermore, we adopt a recursive way of using the language model during style transfer. This strategy enables more effective interaction between the input sentence and GPT-2, helps the model construct more informative prefixes, and thus improves performance. Evaluations on well-known datasets show that our method outperforms state-of-the-art baselines. Ablation studies and human evaluations are also provided for a deeper understanding of the proposed method.
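To make the setup concrete, the sketch below is an illustrative assumption rather than the authors' released code: it shows one common way to prepend learned shared, style, and content prefixes to GPT-2 via inputs_embeds. The class name, prefix lengths, and the content prefix (a simple projection of the mean input embedding, instead of the paper's recursive construction with the language model) are all hypothetical choices for illustration.

```python
# Hedged sketch: prefix-style conditioning of GPT-2 (not the authors' implementation).
import torch
import torch.nn as nn
from transformers import GPT2LMHeadModel, GPT2Tokenizer


class PrefixStyleTransfer(nn.Module):
    """Prepends shared/style/content prefixes to GPT-2 input embeddings."""

    def __init__(self, model_name="gpt2", shared_len=4, style_len=4, num_styles=2):
        super().__init__()
        self.gpt2 = GPT2LMHeadModel.from_pretrained(model_name)
        d = self.gpt2.config.n_embd
        # Shared prefix: task-specific vectors, identical for every input.
        self.shared_prefix = nn.Parameter(torch.randn(shared_len, d) * 0.02)
        # Style prefix: one learned prefix per target style (e.g. negative/positive).
        self.style_prefix = nn.Embedding(num_styles, style_len * d)
        self.style_len, self.d = style_len, d
        # Content prefix: here a simple projection of the mean input embedding;
        # the paper instead builds it recursively with the language model itself.
        self.content_proj = nn.Linear(d, d)

    def forward(self, input_ids, target_style):
        tok_emb = self.gpt2.transformer.wte(input_ids)                  # (B, T, d)
        b = input_ids.size(0)
        shared = self.shared_prefix.unsqueeze(0).expand(b, -1, -1)      # (B, Ls, d)
        style = self.style_prefix(target_style).view(b, self.style_len, self.d)
        content = self.content_proj(tok_emb.mean(dim=1, keepdim=True))  # (B, 1, d)
        inputs_embeds = torch.cat([shared, style, content, tok_emb], dim=1)
        return self.gpt2(inputs_embeds=inputs_embeds)


tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = PrefixStyleTransfer()
ids = tokenizer("the service was painfully slow", return_tensors="pt").input_ids
out = model(ids, target_style=torch.tensor([1]))  # 1 = assumed target-style id
print(out.logits.shape)  # (1, prefix_len + seq_len, vocab_size)
```

In a prefix-tuning setup of this kind, the GPT-2 weights are typically kept frozen and only the prefix parameters are trained, which keeps the number of tunable parameters small per style.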
Anthology ID: 2023.findings-emnlp.990
Volume: Findings of the Association for Computational Linguistics: EMNLP 2023
Month: December
Year: 2023
Address: Singapore
Editors: Houda Bouamor, Juan Pino, Kalika Bali
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 14847–14856
URL: https://aclanthology.org/2023.findings-emnlp.990
DOI: 10.18653/v1/2023.findings-emnlp.990
Cite (ACL): Huiyu Mai, Wenhao Jiang, and Zhi-Hong Deng. 2023. Prefix-Tuning Based Unsupervised Text Style Transfer. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 14847–14856, Singapore. Association for Computational Linguistics.
Cite (Informal): Prefix-Tuning Based Unsupervised Text Style Transfer (Mai et al., Findings 2023)
PDF: https://aclanthology.org/2023.findings-emnlp.990.pdf