Exploring Metaphoric Paraphrase Generation

Kevin Stowe, Nils Beck, Iryna Gurevych


Abstract
Metaphor generation is a difficult task that has seen tremendous improvement with the advent of deep pretrained models. We focus here on the specific task of metaphoric paraphrase generation, in which we take a literal sentence as input and generate a metaphoric sentence that paraphrases it. We compare naive, "free" generation models with those that exploit forms of control over the generation process, adding information based on conceptual metaphor theory. We evaluate two methods for generating paired training data, which is then used to train T5 models for free and controlled generation. We use crowdsourcing to evaluate the results, showing that free models tend to generate more fluent paraphrases, while controlled models are better at generating novel metaphors. We then analyze evaluation metrics, showing that different metrics are necessary to capture different aspects of metaphoric paraphrasing. We release our data and models, as well as our annotated results, in order to facilitate development of better evaluation metrics.
Anthology ID:
2021.conll-1.26
Volume:
Proceedings of the 25th Conference on Computational Natural Language Learning
Month:
November
Year:
2021
Address:
Online
Venue:
CoNLL
SIG:
SIGNLL
Publisher:
Association for Computational Linguistics
Pages:
323–336
URL:
https://aclanthology.org/2021.conll-1.26
DOI:
10.18653/v1/2021.conll-1.26
Cite (ACL):
Kevin Stowe, Nils Beck, and Iryna Gurevych. 2021. Exploring Metaphoric Paraphrase Generation. In Proceedings of the 25th Conference on Computational Natural Language Learning, pages 323–336, Online. Association for Computational Linguistics.
Cite (Informal):
Exploring Metaphoric Paraphrase Generation (Stowe et al., CoNLL 2021)
PDF:
https://aclanthology.org/2021.conll-1.26.pdf
Video:
https://aclanthology.org/2021.conll-1.26.mp4
Code:
ukplab/conll2021-metaphoric-paraphrase-generation