Visualize Before You Write: Imagination-Guided Open-Ended Text Generation

Wanrong Zhu, An Yan, Yujie Lu, Wenda Xu, Xin Wang, Miguel Eckstein, William Yang Wang


Abstract
Recent advances in text-to-image synthesis make it possible to visualize machine imaginations for a given context. On the other hand, when generating text, human writers are gifted at creative visualization, which enhances their writing by forming imaginations as blueprints before putting the stories down in words. Inspired by such a cognitive process, we ask the natural question of whether we can endow machines with the same ability to utilize visual information and construct a general picture of the context to guide text generation. In this work, we propose iNLG, which uses machine-generated images to guide language models (LMs) in open-ended text generation. Experiments and analyses demonstrate the effectiveness of iNLG on open-ended text generation tasks, including text completion, story generation, and concept-to-text generation, in both few-shot and full-data scenarios. Both automatic metrics and human evaluations verify that the text snippets generated by iNLG are coherent and informative while exhibiting little degeneration.
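To make the idea in the abstract concrete, below is a minimal sketch of imagination-guided generation: render the context with a text-to-image model, verbalize the resulting image, and let a language model continue the context conditioned on that description. This is an illustrative approximation, not the authors' iNLG implementation; the specific models (Stable Diffusion, BLIP, GPT-2) and the caption-based bridging step are assumptions of this sketch.

```python
# Illustrative sketch only; NOT the authors' iNLG implementation.
# Pipeline: (1) visualize the context, (2) verbalize the image,
# (3) condition a text-only LM on both. Model choices are assumptions.
import torch
from diffusers import StableDiffusionPipeline
from transformers import (
    BlipProcessor, BlipForConditionalGeneration,
    AutoTokenizer, AutoModelForCausalLM,
)

device = "cuda" if torch.cuda.is_available() else "cpu"
context = "The hikers reached the ridge just as the storm rolled in."

# 1. Visualize: render a machine "imagination" of the context.
t2i = StableDiffusionPipeline.from_pretrained("runwayml/stable-diffusion-v1-5").to(device)
image = t2i(context).images[0]

# 2. Verbalize the imagination so a text-only LM can consume it.
cap_proc = BlipProcessor.from_pretrained("Salesforce/blip-image-captioning-base")
cap_model = BlipForConditionalGeneration.from_pretrained(
    "Salesforce/blip-image-captioning-base").to(device)
cap_inputs = cap_proc(image, return_tensors="pt").to(device)
caption = cap_proc.decode(cap_model.generate(**cap_inputs)[0], skip_special_tokens=True)

# 3. Write: continue the context, conditioned on the visual description.
tok = AutoTokenizer.from_pretrained("gpt2")
lm = AutoModelForCausalLM.from_pretrained("gpt2").to(device)
prompt = f"Scene: {caption}\nStory: {context}"
out = lm.generate(**tok(prompt, return_tensors="pt").to(device),
                  max_new_tokens=60, do_sample=True, top_p=0.9)
print(tok.decode(out[0], skip_special_tokens=True))
```

Verbalizing the image through a caption is only one way to inject the visual signal; conditioning the LM directly on visual features is another. The paper describes the actual conditioning mechanism and training setup used by iNLG.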
Anthology ID:
2023.findings-eacl.5
Volume:
Findings of the Association for Computational Linguistics: EACL 2023
Month:
May
Year:
2023
Address:
Dubrovnik, Croatia
Editors:
Andreas Vlachos, Isabelle Augenstein
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
78–92
URL:
https://aclanthology.org/2023.findings-eacl.5
DOI:
10.18653/v1/2023.findings-eacl.5
Cite (ACL):
Wanrong Zhu, An Yan, Yujie Lu, Wenda Xu, Xin Wang, Miguel Eckstein, and William Yang Wang. 2023. Visualize Before You Write: Imagination-Guided Open-Ended Text Generation. In Findings of the Association for Computational Linguistics: EACL 2023, pages 78–92, Dubrovnik, Croatia. Association for Computational Linguistics.
Cite (Informal):
Visualize Before You Write: Imagination-Guided Open-Ended Text Generation (Zhu et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-eacl.5.pdf
Video:
https://aclanthology.org/2023.findings-eacl.5.mp4