Event Transition Planning for Open-ended Text Generation

Qintong Li, Piji Li, Wei Bi, Zhaochun Ren, Yuxuan Lai, Lingpeng Kong


Abstract
Open-ended text generation tasks, such as dialogue generation and story completion, require models to generate a coherent continuation from limited preceding context. The open-ended nature of these tasks poses new challenges for today's neural auto-regressive text generators. Although these neural models are good at producing human-like text, they struggle to arrange the causalities and relations between given facts and possible ensuing events. To bridge this gap, we propose a novel two-stage method that explicitly plans the ensuing events in open-ended text generation. Our approach can be understood as a specially trained coarse-to-fine algorithm: an event transition planner provides a "coarse" plot skeleton, and a text generator in the second stage refines the skeleton. Experiments on two open-ended text generation tasks demonstrate that our proposed method effectively improves the quality of the generated text, especially its coherence and diversity. We will release the code to the community for further exploration.
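The two-stage, coarse-to-fine pipeline described in the abstract can be sketched roughly as follows. This is a hypothetical illustration, not the paper's implementation: the rule-based `plan_event_transitions` stub stands in for the trained event transition planner, and `generate_text` stands in for the neural text generator that refines the plan.

```python
def plan_event_transitions(context_events):
    """Stage 1 (illustrative stub): extend the observed events with
    plausible ensuing events, yielding a coarse plot skeleton.
    A toy transition table replaces the paper's learned planner."""
    transitions = {
        "PersonX loses job": "PersonX feels sad",
        "PersonX feels sad": "PersonX seeks support",
    }
    skeleton = list(context_events)
    while skeleton[-1] in transitions:
        skeleton.append(transitions[skeleton[-1]])
    return skeleton


def generate_text(skeleton):
    """Stage 2 (illustrative stub): refine the skeleton into surface
    text. A real system would condition a neural generator on both
    the context and the planned events; here we simply verbalize."""
    new_events = skeleton[1:]  # events beyond the given context
    return ". Then ".join(new_events) + "."


context = ["PersonX loses job"]
skeleton = plan_event_transitions(context)   # "coarse" plan
continuation = generate_text(skeleton)       # "fine" realization
print(skeleton)
print(continuation)
```

The key design point is the explicit intermediate representation: the skeleton constrains the second-stage generator, which is intended to improve the coherence of the ensuing events relative to planning-free auto-regressive decoding.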
Anthology ID:
2022.findings-acl.269
Volume:
Findings of the Association for Computational Linguistics: ACL 2022
Month:
May
Year:
2022
Address:
Dublin, Ireland
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
3412–3426
URL:
https://aclanthology.org/2022.findings-acl.269
DOI:
10.18653/v1/2022.findings-acl.269
Cite (ACL):
Qintong Li, Piji Li, Wei Bi, Zhaochun Ren, Yuxuan Lai, and Lingpeng Kong. 2022. Event Transition Planning for Open-ended Text Generation. In Findings of the Association for Computational Linguistics: ACL 2022, pages 3412–3426, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Event Transition Planning for Open-ended Text Generation (Li et al., Findings 2022)
PDF:
https://aclanthology.org/2022.findings-acl.269.pdf
Data
ATOMIC