Context-Driven Satirical News Generation

Zachary Horvitz, Nam Do, Michael L. Littman


Abstract
While mysterious, humor likely hinges on an interplay of entities, their relationships, and cultural connotations. Motivated by the importance of context in humor, we consider methods for constructing and leveraging contextual representations in generating humorous text. Specifically, we study the capacity of transformer-based architectures to generate funny satirical headlines, and show that both language models and summarization models can be fine-tuned to regularly generate headlines that people find funny. Furthermore, we find that summarization models uniquely support satire-generation by enabling the generation of topical humorous text. Outside of our formal study, we note that headlines generated by our model were accepted via a competitive process into a satirical newspaper, and one headline was ranked as high as or better than 73% of human submissions. As part of our work, we contribute a dataset of over 15K satirical headlines paired with ranked contextual information from news articles and Wikipedia.
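
The abstract describes fine-tuning summarization models on satirical headlines paired with contextual information. The sketch below illustrates one way such a context-to-headline setup could look using Hugging Face Transformers; it is not the authors' released code, and the base model (facebook/bart-base), the toy example data, field names, and hyperparameters are assumptions made purely for illustration.

    # Minimal sketch: fine-tune a BART-style summarization model to map
    # contextual text (e.g., related news/Wikipedia snippets) to a satirical
    # headline. All specifics here are illustrative assumptions.
    from transformers import (
        BartForConditionalGeneration,
        BartTokenizerFast,
        DataCollatorForSeq2Seq,
        Seq2SeqTrainer,
        Seq2SeqTrainingArguments,
    )
    from datasets import Dataset

    model_name = "facebook/bart-base"
    tokenizer = BartTokenizerFast.from_pretrained(model_name)
    model = BartForConditionalGeneration.from_pretrained(model_name)

    # Toy stand-in for (context, headline) pairs; the paper's dataset pairs
    # over 15K satirical headlines with ranked contextual information.
    pairs = Dataset.from_dict({
        "context": ["City council debates new zoning rules for downtown parking."],
        "headline": ["Area Man Heroically Parks Somewhere"],
    })

    def preprocess(batch):
        # Context becomes the encoder input; the headline becomes the labels.
        model_inputs = tokenizer(batch["context"], truncation=True, max_length=512)
        labels = tokenizer(text_target=batch["headline"], truncation=True, max_length=64)
        model_inputs["labels"] = labels["input_ids"]
        return model_inputs

    tokenized = pairs.map(preprocess, batched=True, remove_columns=pairs.column_names)

    args = Seq2SeqTrainingArguments(
        output_dir="satire-bart",
        per_device_train_batch_size=8,
        num_train_epochs=3,
        learning_rate=3e-5,
        predict_with_generate=True,
    )

    trainer = Seq2SeqTrainer(
        model=model,
        args=args,
        train_dataset=tokenized,
        data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
        tokenizer=tokenizer,
    )
    trainer.train()

    # After fine-tuning, generate a headline conditioned on fresh context.
    inputs = tokenizer("Tech giant unveils yet another smart speaker.", return_tensors="pt")
    inputs = {k: v.to(model.device) for k, v in inputs.items()}
    ids = model.generate(**inputs, max_length=32, num_beams=4)
    print(tokenizer.decode(ids[0], skip_special_tokens=True))
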
Anthology ID:
2020.figlang-1.5
Volume:
Proceedings of the Second Workshop on Figurative Language Processing
Month:
July
Year:
2020
Address:
Online
Editors:
Beata Beigman Klebanov, Ekaterina Shutova, Patricia Lichtenstein, Smaranda Muresan, Chee Wee Leong, Anna Feldman, Debanjan Ghosh
Venue:
Fig-Lang
Publisher:
Association for Computational Linguistics
Pages:
40–50
URL:
https://aclanthology.org/2020.figlang-1.5
DOI:
10.18653/v1/2020.figlang-1.5
Bibkey:
Cite (ACL):
Zachary Horvitz, Nam Do, and Michael L. Littman. 2020. Context-Driven Satirical News Generation. In Proceedings of the Second Workshop on Figurative Language Processing, pages 40–50, Online. Association for Computational Linguistics.
Cite (Informal):
Context-Driven Satirical News Generation (Horvitz et al., Fig-Lang 2020)
PDF:
https://aclanthology.org/2020.figlang-1.5.pdf
Video:
http://slideslive.com/38929710