EtriCA: Event-Triggered Context-Aware Story Generation Augmented by Cross Attention

Chen Tang, Chenghua Lin, Henglin Huang, Frank Guerin, Zhihao Zhang


Abstract
One of the key challenges of automatic story generation is how to generate a long narrative that maintains fluency, relevance, and coherence. Despite recent progress, current story generation systems still struggle to effectively capture contextual and event features, which has a profound impact on a model's generation performance. To address these challenges, we present EtriCA, a novel neural generation model that improves the relevance and coherence of generated stories by residually mapping context features onto event sequences through a cross-attention mechanism. This feature-capturing mechanism allows our model to better exploit the logical relatedness between events when generating stories. Extensive experiments based on both automatic and human evaluations show that our model significantly outperforms state-of-the-art baselines, demonstrating its effectiveness in leveraging context and event features.
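The abstract describes residually mapping context features onto event sequences via cross attention. The following is a minimal, hypothetical PyTorch sketch of that general idea only, not the authors' implementation: event features act as queries over context features, and the attended context is added back residually. All module names, shapes, and hyperparameters here are assumptions.

```python
# Illustrative sketch only: cross-attention that residually fuses context
# features into event-sequence features. Not the EtriCA implementation;
# sizes, names, and layer choices are assumptions.
import torch
import torch.nn as nn

class ContextEventCrossAttention(nn.Module):
    def __init__(self, hidden_size: int = 768, num_heads: int = 8):
        super().__init__()
        # Event features serve as queries; context features serve as keys/values.
        self.cross_attn = nn.MultiheadAttention(hidden_size, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(hidden_size)

    def forward(self, event_feats: torch.Tensor, context_feats: torch.Tensor) -> torch.Tensor:
        # event_feats:   (batch, num_events, hidden)
        # context_feats: (batch, context_len, hidden)
        attended, _ = self.cross_attn(query=event_feats, key=context_feats, value=context_feats)
        # Residual mapping: add attended context back onto the event features.
        return self.norm(event_feats + attended)

# Toy usage with random features
fuser = ContextEventCrossAttention()
events = torch.randn(2, 5, 768)    # e.g., 5 events per story
context = torch.randn(2, 30, 768)  # e.g., 30 context tokens
fused = fuser(events, context)     # shape: (2, 5, 768)
```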
Anthology ID:
2022.findings-emnlp.403
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2022
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates
Editors:
Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
5504–5518
URL:
https://aclanthology.org/2022.findings-emnlp.403
DOI:
10.18653/v1/2022.findings-emnlp.403
Cite (ACL):
Chen Tang, Chenghua Lin, Henglin Huang, Frank Guerin, and Zhihao Zhang. 2022. EtriCA: Event-Triggered Context-Aware Story Generation Augmented by Cross Attention. In Findings of the Association for Computational Linguistics: EMNLP 2022, pages 5504–5518, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
EtriCA: Event-Triggered Context-Aware Story Generation Augmented by Cross Attention (Tang et al., Findings 2022)
PDF:
https://aclanthology.org/2022.findings-emnlp.403.pdf
Video:
https://aclanthology.org/2022.findings-emnlp.403.mp4