Returning to the Start: Generating Narratives with Related Endpoints

Anneliese Brei, Chao Zhao, Snigdha Chaturvedi
Abstract
Human writers often *bookend* their writing with ending sentences that relate back to the beginning sentences in order to compose a satisfying narrative that “closes the loop.” Motivated by this observation, we propose RENarGen, a controllable story-generation paradigm that generates narratives by ensuring the first and last sentences are related and then infilling the middle sentences. Our contributions include an initial exploration of how various methods of bookending from Narratology affect language modeling for stories. Automatic and human evaluations indicate RENarGen produces better stories with more narrative closure than current autoregressive models.
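
The pipeline the abstract describes has three steps: generate an opening sentence, generate an ending that refers back to it, and infill the sentences in between. The sketch below is a minimal structural illustration of that bookending idea under those assumptions; it is not the authors' RENarGen implementation, and every function body is a hypothetical placeholder standing in for a language-model call.

# Minimal structural sketch of the "bookending" paradigm described in the
# abstract: write the first sentence, write a last sentence related to it,
# then infill the middle. Every function here is a hypothetical placeholder
# for a language-model call; this is NOT the authors' RENarGen code.

def generate_first_sentence(prompt: str) -> str:
    # Placeholder: a story-generation model would continue the prompt here.
    return f"{prompt.strip()} set out at dawn."

def generate_related_last_sentence(first: str) -> str:
    # Placeholder: the key idea is that the ending refers back to the
    # opening to "close the loop"; here we simply reuse the opening's subject.
    subject = first.split()[0]
    return f"By nightfall, {subject} had returned to where it all began."

def infill_middle(first: str, last: str, n: int = 3) -> list[str]:
    # Placeholder: middle sentences would be generated conditioned on BOTH
    # endpoints, so the story builds toward the predetermined ending.
    return [f"(middle sentence {i + 1}, bridging opening and ending)" for i in range(n)]

def bookended_story(prompt: str) -> str:
    first = generate_first_sentence(prompt)
    last = generate_related_last_sentence(first)
    middle = infill_middle(first, last)
    return " ".join([first, *middle, last])

if __name__ == "__main__":
    print(bookended_story("The lighthouse keeper"))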
Anthology ID: 2024.naacl-short.10
Volume: Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 2: Short Papers)
Month: June
Year: 2024
Address: Mexico City, Mexico
Editors: Kevin Duh, Helena Gomez, Steven Bethard
Venue: NAACL
Publisher: Association for Computational Linguistics
Pages: 101–112
URL: https://aclanthology.org/2024.naacl-short.10
DOI: 10.18653/v1/2024.naacl-short.10
Cite (ACL): Anneliese Brei, Chao Zhao, and Snigdha Chaturvedi. 2024. Returning to the Start: Generating Narratives with Related Endpoints. In Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 2: Short Papers), pages 101–112, Mexico City, Mexico. Association for Computational Linguistics.
Cite (Informal): Returning to the Start: Generating Narratives with Related Endpoints (Brei et al., NAACL 2024)
PDF: https://aclanthology.org/2024.naacl-short.10.pdf