Generic resources are what you need: Style transfer tasks without task-specific parallel training data

Huiyuan Lai, Antonio Toral, Malvina Nissim


Abstract
Style transfer aims to rewrite a source text in a different target style while preserving its content. We propose a novel approach to this task that leverages generic resources, and without using any task-specific parallel (source–target) data outperforms existing unsupervised approaches on the two most popular style transfer tasks: formality transfer and polarity swap. In practice, we adopt a multi-step procedure which builds on a generic pre-trained sequence-to-sequence model (BART). First, we strengthen the model’s ability to rewrite by further pre-training BART on both an existing collection of generic paraphrases, as well as on synthetic pairs created using a general-purpose lexical resource. Second, through an iterative back-translation approach, we train two models, each in a transfer direction, so that they can provide each other with synthetically generated pairs, dynamically in the training process. Lastly, we let our best resulting model generate static synthetic pairs to be used in a supervised training regime. Besides methodology and state-of-the-art results, a core contribution of this work is a reflection on the nature of the two tasks we address, and how their differences are highlighted by their response to our approach.
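To make the second step of the procedure concrete, below is a minimal sketch of an iterative back-translation loop with two BART models, one per transfer direction, where each model dynamically produces synthetic sources for training the other. All model names, hyper-parameters, and the toy style-A/style-B data are illustrative assumptions for this sketch; they are not the authors' released implementation (see the Code link below for that).

```python
import torch
from transformers import BartTokenizer, BartForConditionalGeneration

tokenizer = BartTokenizer.from_pretrained("facebook/bart-base")
# One model per transfer direction, e.g. informal->formal and formal->informal.
model_fwd = BartForConditionalGeneration.from_pretrained("facebook/bart-base")
model_bwd = BartForConditionalGeneration.from_pretrained("facebook/bart-base")
opt_fwd = torch.optim.AdamW(model_fwd.parameters(), lr=3e-5)
opt_bwd = torch.optim.AdamW(model_bwd.parameters(), lr=3e-5)

def train_step(model, optimizer, sources, targets):
    """One supervised update on (source, target) pairs."""
    batch = tokenizer(sources, return_tensors="pt", padding=True, truncation=True)
    labels = tokenizer(targets, return_tensors="pt", padding=True, truncation=True).input_ids
    labels[labels == tokenizer.pad_token_id] = -100  # ignore padding in the loss
    loss = model(**batch, labels=labels).loss
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
    return loss.item()

def back_translate(model, sentences):
    """Generate synthetic counterparts of the input sentences in the other style."""
    batch = tokenizer(sentences, return_tensors="pt", padding=True, truncation=True)
    with torch.no_grad():
        out = model.generate(**batch, max_length=64, num_beams=4)
    return tokenizer.batch_decode(out, skip_special_tokens=True)

# Toy non-parallel data, one small batch per style (hypothetical examples).
style_a_batches = [["i ain't got no clue", "gonna be there soon"]]          # e.g. informal
style_b_batches = [["I do not have any idea.", "I will arrive shortly."]]   # e.g. formal

for batch_a, batch_b in zip(style_a_batches, style_b_batches):
    # model_bwd turns real style-B text into synthetic style-A sources,
    # yielding (synthetic A, real B) pairs that train model_fwd; and vice versa.
    synthetic_a = back_translate(model_bwd, batch_b)
    train_step(model_fwd, opt_fwd, synthetic_a, batch_b)

    synthetic_b = back_translate(model_fwd, batch_a)
    train_step(model_bwd, opt_bwd, synthetic_b, batch_a)
```

In the abstract's third step, the best model obtained from such a loop would then be used once to generate a static set of synthetic pairs for a final supervised training pass.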
Anthology ID:
2021.emnlp-main.349
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
4241–4254
URL:
https://aclanthology.org/2021.emnlp-main.349
DOI:
10.18653/v1/2021.emnlp-main.349
Cite (ACL):
Huiyuan Lai, Antonio Toral, and Malvina Nissim. 2021. Generic resources are what you need: Style transfer tasks without task-specific parallel training data. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 4241–4254, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Generic resources are what you need: Style transfer tasks without task-specific parallel training data (Lai et al., EMNLP 2021)
PDF:
https://aclanthology.org/2021.emnlp-main.349.pdf
Video:
https://aclanthology.org/2021.emnlp-main.349.mp4
Code:
laihuiyuan/generic-resources-for-tst
Data:
GYAFC