Negative Lexically Constrained Decoding for Paraphrase Generation

Tomoyuki Kajiwara


Abstract
Paraphrase generation can be regarded as monolingual translation. Unlike bilingual machine translation, paraphrase generation rewrites only a limited portion of an input sentence. Hence, previous methods based on machine translation often behave conservatively and fail to make the necessary rewrites. To solve this problem, we propose a neural model for paraphrase generation that first identifies words in the source sentence that should be paraphrased. These words are then paraphrased using negative lexically constrained decoding, which prevents them from appearing verbatim in the output. Experiments on text simplification and formality transfer show that our model improves the quality of paraphrasing by making the necessary rewrites to an input sentence.
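The core idea of negative lexically constrained decoding can be illustrated with a minimal sketch: at each decoding step, the scores of the banned tokens (the source words identified as needing paraphrasing) are set to negative infinity so they can never be emitted. This is not the paper's implementation, which applies such constraints within beam search in an NMT toolkit; the function name, the `model_step` callback, and the toy vocabulary below are hypothetical placeholders.

```python
import numpy as np

def decode_with_negative_constraints(model_step, start_id, eos_id,
                                     banned_ids, max_len=20):
    """Greedy decoding that never outputs tokens in `banned_ids`."""
    output = [start_id]
    for _ in range(max_len):
        scores = np.array(model_step(output), dtype=float)  # next-token scores
        scores[list(banned_ids)] = -np.inf   # negative lexical constraints
        next_id = int(np.argmax(scores))
        output.append(next_id)
        if next_id == eos_id:
            break
    return output

if __name__ == "__main__":
    # Toy stand-in for a seq2seq decoder step over a 10-token vocabulary.
    rng = np.random.default_rng(0)
    vocab_size, eos_id = 10, 9
    def toy_model_step(prefix):
        return rng.random(vocab_size)
    print(decode_with_negative_constraints(toy_model_step, start_id=0,
                                           eos_id=eos_id, banned_ids={3, 5}))
```

In the paper's setting, the banned tokens are the words in the input sentence that the first stage marks as requiring paraphrasing, which forces the decoder to rewrite them rather than copy them.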
Anthology ID:
P19-1607
Volume:
Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2019
Address:
Florence, Italy
Editors:
Anna Korhonen, David Traum, Lluís Màrquez
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
6047–6052
URL:
https://aclanthology.org/P19-1607
DOI:
10.18653/v1/P19-1607
Cite (ACL):
Tomoyuki Kajiwara. 2019. Negative Lexically Constrained Decoding for Paraphrase Generation. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, pages 6047–6052, Florence, Italy. Association for Computational Linguistics.
Cite (Informal):
Negative Lexically Constrained Decoding for Paraphrase Generation (Kajiwara, ACL 2019)
PDF:
https://aclanthology.org/P19-1607.pdf
Data
GYAFC, Newsela