Content Planning for Neural Story Generation with Aristotelian Rescoring

Seraphina Goldfarb-Tarrant, Tuhin Chakrabarty, Ralph Weischedel, Nanyun Peng


Abstract
Long-form narrative text generated from large language models manages a fluent impersonation of human writing, but only at the local sentence level, and lacks structure or global cohesion. We posit that many of the problems of story generation can be addressed via high-quality content planning, and present a system that focuses on how to learn good plot structures to guide story generation. We utilize a plot-generation language model along with an ensemble of rescoring models that each implement an aspect of good story-writing as detailed in Aristotle’s Poetics. We find that stories written with our more principled plot-structure are both more relevant to a given prompt and higher quality than baselines that do not content plan, or that plan in an unprincipled way.
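The abstract describes a plot-generation language model whose outputs are reranked by an ensemble of rescoring models. A minimal sketch of that generate-and-rescore pattern is below; the scorer functions, weights, and candidate plots are illustrative stand-ins, not the paper's actual Aristotelian rescorers or data.

```python
from typing import Callable, List

def rescore(candidates: List[str],
            scorers: List[Callable[[str], float]],
            weights: List[float]) -> str:
    """Pick the candidate plot with the highest weighted ensemble score.

    Each scorer is meant to capture one aspect of good story-writing;
    the ensemble combines them as a weighted sum.
    """
    def ensemble_score(plot: str) -> float:
        return sum(w * s(plot) for w, s in zip(weights, scorers))
    return max(candidates, key=ensemble_score)

# Hypothetical toy scorers (placeholders for learned rescoring models):
# one rewards longer plots, one rewards lexical overlap with the prompt.
def length_scorer(plot: str) -> float:
    return float(len(plot.split()))

def overlap_scorer_for(prompt: str) -> Callable[[str], float]:
    prompt_words = set(prompt.lower().split())
    def scorer(plot: str) -> float:
        return float(len(prompt_words & set(plot.lower().split())))
    return scorer

prompt = "a knight loses his sword"
candidates = [
    "knight wakes -> eats breakfast",
    "knight loses sword -> searches forest -> finds sword",
]
best = rescore(candidates,
               [length_scorer, overlap_scorer_for(prompt)],
               [0.1, 1.0])
```

In the paper the scorers are trained models and the rescoring steers plot generation step by step, but the core selection logic is this kind of weighted reranking of candidate plot structures.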
Anthology ID:
2020.emnlp-main.351
Volume:
Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)
Month:
November
Year:
2020
Address:
Online
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
4319–4338
URL:
https://aclanthology.org/2020.emnlp-main.351
DOI:
10.18653/v1/2020.emnlp-main.351
PDF:
https://aclanthology.org/2020.emnlp-main.351.pdf
Video:
https://slideslive.com/38939240
Code:
PlusLabNLP/story-gen-BART
Data:
ROCStories
WritingPrompts