VAE based Text Style Transfer with Pivot Words Enhancement Learning

Haoran Xu, Sixing Lu, Zhongkai Sun, Chengyuan Ma, Chenlei Guo


Abstract
Text Style Transfer (TST) aims to alter the underlying style of the source text to another specific style while keeping the same content. Due to the scarcity of high-quality parallel training data, unsupervised learning has become a trending direction for TST tasks. In this paper, we propose a novel VAE based Text Style Transfer with pivOt Words Enhancement leaRning (VT-STOWER) method which utilizes Variational AutoEncoder (VAE) and external style embeddings to learn semantics and style distribution jointly. Additionally, we introduce pivot words learning, which is applied to learn decisive words for a specific style and thereby further improve the overall performance of the style transfer. The proposed VT-STOWER can be scaled to different TST scenarios given very limited and non-parallel training data with a novel and flexible style strength control mechanism. Experiments demonstrate that the VT-STOWER outperforms the state-of-the-art on sentiment, formality, and code-switching TST tasks.
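The "pivot words" idea — identifying words that are decisive for a given style — can be illustrated with a simple frequency-ratio heuristic. This is a hypothetical sketch for intuition only, not the paper's actual learning method: words occurring far more often in one style corpus than the other receive a large absolute score and act as style-indicative pivots.

```python
from collections import Counter
import math

def pivot_word_scores(style_a_sents, style_b_sents, smoothing=1.0):
    """Score words by how strongly they indicate style A over style B.

    A simplified smoothed log-frequency-ratio heuristic (an assumption,
    not the VT-STOWER method): score > 0 leans toward style A,
    score < 0 toward style B, near 0 means style-neutral content.
    """
    freq_a = Counter(w for s in style_a_sents for w in s.lower().split())
    freq_b = Counter(w for s in style_b_sents for w in s.lower().split())
    vocab = set(freq_a) | set(freq_b)
    total_a = sum(freq_a.values()) + smoothing * len(vocab)
    total_b = sum(freq_b.values()) + smoothing * len(vocab)
    return {
        w: math.log((freq_a[w] + smoothing) / total_a)
         - math.log((freq_b[w] + smoothing) / total_b)
        for w in vocab
    }

# Toy sentiment corpora (illustrative data, not from the paper):
positive = ["the food was great", "great service and great staff"]
negative = ["the food was terrible", "terrible service and slow staff"]
scores = pivot_word_scores(positive, negative)
# "great" scores positive, "terrible" negative, "food" near zero.
```

Under such a scoring scheme, the highest-magnitude words would be candidate pivots for a style, while shared content words stay near zero and are preserved during transfer.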
Anthology ID:
2021.icon-main.20
Volume:
Proceedings of the 18th International Conference on Natural Language Processing (ICON)
Month:
December
Year:
2021
Address:
National Institute of Technology Silchar, Silchar, India
Editors:
Sivaji Bandyopadhyay, Sobha Lalitha Devi, Pushpak Bhattacharyya
Venue:
ICON
Publisher:
NLP Association of India (NLPAI)
Pages:
162–172
URL:
https://aclanthology.org/2021.icon-main.20
Cite (ACL):
Haoran Xu, Sixing Lu, Zhongkai Sun, Chengyuan Ma, and Chenlei Guo. 2021. VAE based Text Style Transfer with Pivot Words Enhancement Learning. In Proceedings of the 18th International Conference on Natural Language Processing (ICON), pages 162–172, National Institute of Technology Silchar, Silchar, India. NLP Association of India (NLPAI).
Cite (Informal):
VAE based Text Style Transfer with Pivot Words Enhancement Learning (Xu et al., ICON 2021)
PDF:
https://aclanthology.org/2021.icon-main.20.pdf
Code
fe1ixxu/vt-stower
Data
GYAFC, LinCE