Text Style Transferring via Adversarial Masking and Styled Filling

Jiarui Wang, Richong Zhang, Junfan Chen, Jaein Kim, Yongyi Mao


Abstract
Text style transfer is an important task in natural language processing with broad applications. Existing models following the masking-and-filling scheme suffer from two challenges: the word masking procedure may mistakenly remove unintended words, and the words selected in the word filling procedure may lack diversity and semantic consistency. To tackle both challenges, in this study we propose a style transfer model with an adversarial masking approach and a styled filling technique (AMSF). Specifically, AMSF first trains a mask predictor by adversarial training without manual configuration. Two additional losses, i.e., an entropy maximization loss and a consistency regularization loss, are then introduced in training the word filling module to guarantee the diversity and semantic consistency of the transferred texts. Experimental results and analysis on two benchmark text style transfer datasets demonstrate the effectiveness of the proposed approaches.
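
The abstract does not spell out how the two auxiliary losses are computed; the following is a minimal PyTorch sketch of one plausible reading, provided only as an illustration. All names, shapes, and formulations here (fill_logits, src_repr, tgt_repr, mask, the entropy over the filling distribution, the cosine-based consistency term) are assumptions made for this sketch, not the authors' implementation.

import torch
import torch.nn.functional as F

def entropy_maximization_loss(fill_logits, mask):
    # fill_logits: (batch, seq_len, vocab) scores from a hypothetical word filling module
    # mask: (batch, seq_len) float, 1.0 at masked positions to be filled, 0.0 elsewhere
    log_probs = F.log_softmax(fill_logits, dim=-1)
    entropy = -(log_probs.exp() * log_probs).sum(dim=-1)   # per-position entropy
    # encouraging diverse fills = maximizing entropy, so we minimize its negative
    return -(entropy * mask).sum() / mask.sum().clamp(min=1.0)

def consistency_regularization_loss(src_repr, tgt_repr, mask):
    # src_repr, tgt_repr: (batch, seq_len, hidden) contextual representations of the
    # source sentence and the style-transferred sentence (assumed encoders)
    # penalize semantic drift at the positions kept from the source via cosine distance
    cos = F.cosine_similarity(src_repr, tgt_repr, dim=-1)   # (batch, seq_len)
    keep = 1.0 - mask
    return ((1.0 - cos) * keep).sum() / keep.sum().clamp(min=1.0)

# toy usage with random tensors (batch=2, seq_len=5, vocab=100, hidden=16)
logits = torch.randn(2, 5, 100)
m = torch.tensor([[0., 1., 1., 0., 0.], [1., 0., 0., 0., 1.]])
h_src, h_tgt = torch.randn(2, 5, 16), torch.randn(2, 5, 16)
aux_loss = 0.1 * entropy_maximization_loss(logits, m) + 0.1 * consistency_regularization_loss(h_src, h_tgt, m)

In such a setup the two terms would simply be added, with illustrative weights of 0.1, to the main filling objective when training the word filling module.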
Anthology ID: 2022.emnlp-main.521
Volume: Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing
Month: December
Year: 2022
Address: Abu Dhabi, United Arab Emirates
Editors: Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue: EMNLP
Publisher: Association for Computational Linguistics
Pages: 7654–7663
URL: https://aclanthology.org/2022.emnlp-main.521
DOI: 10.18653/v1/2022.emnlp-main.521
Cite (ACL): Jiarui Wang, Richong Zhang, Junfan Chen, Jaein Kim, and Yongyi Mao. 2022. Text Style Transferring via Adversarial Masking and Styled Filling. In Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, pages 7654–7663, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal): Text Style Transferring via Adversarial Masking and Styled Filling (Wang et al., EMNLP 2022)
PDF: https://aclanthology.org/2022.emnlp-main.521.pdf