Reduction-Synthesis: Plug-and-Play for Sentiment Style Transfer

Sheng Xu, Fumiyo Fukumoto, Yoshimi Suzuki


Abstract
Sentiment style transfer (SST), a variant of text style transfer (TST), has recently attracted extensive interest. Some disentangling-based approaches have improved performance, while most still struggle to transfer the input properly because the sentiment style is intertwined with the content of the text. To alleviate this issue, we propose a plug-and-play method that leverages an iterative self-refinement algorithm with a large language model (LLM). Our approach separates the straightforward Seq2Seq generation into two phases: (1) a Reduction phase, which generates a style-free sequence for a given text, and (2) a Synthesis phase, which generates the target text by leveraging the sequence output from the first phase. The experimental results on two datasets demonstrate that our transfer strategy is effective for challenging SST cases where the baseline methods perform poorly. Our code is available online.
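The two-phase strategy described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: `call_llm`, the prompt wording, and the refinement loop are assumptions, and the LLM is stubbed with canned responses so the example runs offline.

```python
def call_llm(prompt: str) -> str:
    """Hypothetical LLM call; a real system would query an API here.
    Stubbed with canned responses keyed on the prompt's intent."""
    if "Remove the sentiment" in prompt:
        return "the service at this restaurant"  # style-free content
    return "The service at this restaurant was wonderful."  # styled output


def reduction(text: str) -> str:
    """Phase 1 (Reduction): strip sentiment-bearing language,
    keeping only the content of the input."""
    return call_llm(
        f"Remove the sentiment from this sentence, "
        f"keeping only its content: {text}"
    )


def synthesis(content: str, target_style: str) -> str:
    """Phase 2 (Synthesis): regenerate the text in the target
    sentiment from the style-free sequence."""
    return call_llm(f"Rewrite in a {target_style} sentiment: {content}")


def transfer(text: str, target_style: str, max_iters: int = 3) -> str:
    """Reduce, then synthesize; repeat as a simple self-refinement loop."""
    output = synthesis(reduction(text), target_style)
    for _ in range(max_iters - 1):
        # A real refinement step would score the output and re-prompt;
        # with the stub, each pass returns the same fixed answer.
        output = synthesis(reduction(output), target_style)
    return output


print(transfer("The service at this restaurant was terrible.", "positive"))
```

Splitting generation this way means the synthesis prompt never sees the source sentiment, which is the point of the disentangling step.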
Anthology ID:
2024.inlg-main.28
Volume:
Proceedings of the 17th International Natural Language Generation Conference
Month:
September
Year:
2024
Address:
Tokyo, Japan
Editors:
Saad Mahamood, Nguyen Le Minh, Daphne Ippolito
Venue:
INLG
SIG:
SIGGEN
Publisher:
Association for Computational Linguistics
Pages:
330–343
URL:
https://aclanthology.org/2024.inlg-main.28
Cite (ACL):
Sheng Xu, Fumiyo Fukumoto, and Yoshimi Suzuki. 2024. Reduction-Synthesis: Plug-and-Play for Sentiment Style Transfer. In Proceedings of the 17th International Natural Language Generation Conference, pages 330–343, Tokyo, Japan. Association for Computational Linguistics.
Cite (Informal):
Reduction-Synthesis: Plug-and-Play for Sentiment Style Transfer (Xu et al., INLG 2024)
PDF:
https://aclanthology.org/2024.inlg-main.28.pdf
Supplementary attachment:
2024.inlg-main.28.Supplementary_Attachment.zip