Paraphrase Generation as Unsupervised Machine Translation

Xiaofei Sun, Yufei Tian, Yuxian Meng, Nanyun Peng, Fei Wu, Jiwei Li, Chun Fan


Abstract
In this paper, we propose a new paradigm for paraphrase generation by treating the task as unsupervised machine translation (UMT), based on the assumption that a large-scale unlabeled monolingual corpus must contain pairs of sentences expressing the same meaning. The proposed paradigm first splits a large unlabeled corpus into multiple clusters and trains multiple UMT models using pairs of these clusters. Then, based on the paraphrase pairs produced by these UMT models, a unified surrogate model is trained to serve as the final paraphrase generator, which can be used directly for inference in the unsupervised setup or finetuned on labeled datasets in the supervised setup. The proposed method offers advantages over machine-translation-based paraphrase generation methods, as it avoids reliance on bilingual sentence pairs. It also allows humans to intervene in the model so that more diverse paraphrases can be generated using different filtering criteria. Extensive experiments on existing paraphrase datasets in both the supervised and unsupervised setups demonstrate the effectiveness of the proposed paradigm.
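The pipeline described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the clustering scheme, the UMT training, and all function names here are hypothetical stand-ins (a round-robin partition replaces real clustering, and the UMT "model" is a trivial identity paraphraser) that show only the data flow from cluster pairs to surrogate training pairs.

```python
from itertools import combinations

def split_into_clusters(corpus, k):
    # Hypothetical stand-in for the paper's corpus clustering step:
    # a simple round-robin partition of sentences into k clusters.
    clusters = [[] for _ in range(k)]
    for i, sent in enumerate(corpus):
        clusters[i % k].append(sent)
    return clusters

def train_umt(cluster_a, cluster_b):
    # Stub for training an unsupervised MT model that treats the two
    # clusters as two "languages"; a real model would learn to rewrite
    # sentences, here we return an identity paraphraser for illustration.
    def paraphrase(sent):
        return sent
    return paraphrase

def build_surrogate_data(corpus, k):
    # Train one UMT model per cluster pair and collect the resulting
    # (source, paraphrase) pairs; a unified surrogate model would then
    # be trained on this pooled data.
    clusters = split_into_clusters(corpus, k)
    pairs = []
    for a, b in combinations(range(k), 2):
        model = train_umt(clusters[a], clusters[b])
        for sent in clusters[a]:
            pairs.append((sent, model(sent)))
    return pairs

corpus = ["the cat sat", "a dog ran", "it is raining", "the sun shines"]
surrogate_pairs = build_surrogate_data(corpus, k=2)
```

With `k=2` clusters there is a single cluster pair, so the sketch yields one (source, paraphrase) pair per sentence in the first cluster; the filtering criteria mentioned in the abstract would be applied to `surrogate_pairs` before surrogate training.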
Anthology ID:
2022.coling-1.555
Volume:
Proceedings of the 29th International Conference on Computational Linguistics
Month:
October
Year:
2022
Address:
Gyeongju, Republic of Korea
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
6379–6391
URL:
https://aclanthology.org/2022.coling-1.555
Cite (ACL):
Xiaofei Sun, Yufei Tian, Yuxian Meng, Nanyun Peng, Fei Wu, Jiwei Li, and Chun Fan. 2022. Paraphrase Generation as Unsupervised Machine Translation. In Proceedings of the 29th International Conference on Computational Linguistics, pages 6379–6391, Gyeongju, Republic of Korea. International Committee on Computational Linguistics.
Cite (Informal):
Paraphrase Generation as Unsupervised Machine Translation (Sun et al., COLING 2022)
PDF:
https://aclanthology.org/2022.coling-1.555.pdf
Data
COCO