CM-Gen: A Neural Framework for Chinese Metaphor Generation with Explicit Context Modelling

Yucheng Li, Chenghua Lin, Frank Guerin


Abstract
Nominal metaphors are frequently used in human language and have been shown to be effective in persuading, expressing emotion, and stimulating interest. This paper tackles the problem of Chinese Nominal Metaphor (NM) generation. We introduce a novel multitask framework that jointly optimizes three tasks: NM identification, NM component identification, and NM generation. The metaphor identification module can perform a self-training procedure, discovering novel metaphors from a large-scale unlabeled corpus for NM generation. The NM component identification module emphasizes components during training and conditions generation on these NM components for more coherent results. To train the NM identification and component identification modules, we construct an annotated corpus of 6.3k sentences containing diverse metaphorical patterns. Automatic metrics show that our method produces diverse metaphors with good readability, 92% of which are novel metaphorical comparisons. Human evaluation shows our model significantly outperforms baselines on consistency and creativity.
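The self-training procedure described above can be sketched roughly as follows: the NM identification module scores sentences from an unlabeled corpus, and high-confidence hits are promoted into the training pool for generation. This is a minimal illustrative sketch, not the paper's implementation; the function names, the toy scoring heuristic, and the 0.9 confidence threshold are all assumptions made for the example.

```python
def identify_metaphor(sentence):
    """Stand-in for the trained NM identification module: returns a
    probability that the sentence contains a nominal metaphor.
    (Toy substring heuristic, purely for illustration.)"""
    return 0.95 if "is a" in sentence else 0.1

def self_train(unlabeled_corpus, threshold=0.9):
    """Collect pseudo-labeled metaphorical sentences from an
    unlabeled corpus to augment the NM generation training data."""
    return [s for s in unlabeled_corpus if identify_metaphor(s) >= threshold]

corpus = [
    "Life is a journey.",        # metaphorical
    "The meeting starts at noon.",  # literal
    "Time is a thief.",          # metaphorical
]
print(self_train(corpus))  # keeps only the two metaphorical sentences
```

In the paper's actual pipeline the identifier is a trained neural classifier and the corpus is Chinese text; the loop structure (score, filter by confidence, add to training pool) is the part this sketch illustrates.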
Anthology ID:
2022.coling-1.563
Volume:
Proceedings of the 29th International Conference on Computational Linguistics
Month:
October
Year:
2022
Address:
Gyeongju, Republic of Korea
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
6468–6479
URL:
https://aclanthology.org/2022.coling-1.563
Cite (ACL):
Yucheng Li, Chenghua Lin, and Frank Guerin. 2022. CM-Gen: A Neural Framework for Chinese Metaphor Generation with Explicit Context Modelling. In Proceedings of the 29th International Conference on Computational Linguistics, pages 6468–6479, Gyeongju, Republic of Korea. International Committee on Computational Linguistics.
Cite (Informal):
CM-Gen: A Neural Framework for Chinese Metaphor Generation with Explicit Context Modelling (Li et al., COLING 2022)
PDF:
https://aclanthology.org/2022.coling-1.563.pdf