Plan-then-Generate: Controlled Data-to-Text Generation via Planning

Yixuan Su, David Vandyke, Sihui Wang, Yimai Fang, Nigel Collier


Abstract
Recent developments in neural networks have led to advances in data-to-text generation. However, the inability of neural models to control the structure of the generated output can be limiting in certain real-world applications. In this study, we propose a novel Plan-then-Generate (PlanGen) framework to improve the controllability of neural data-to-text models. Extensive experiments and analyses are conducted on two benchmark datasets, ToTTo and WebNLG. The results show that our model is able to control both the intra-sentence and inter-sentence structure of the generated output. Furthermore, empirical comparisons against previous state-of-the-art methods show that our model improves the generation quality as well as the output diversity, as judged by human and automatic evaluations.
Anthology ID:
2021.findings-emnlp.76
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2021
Month:
November
Year:
2021
Address:
Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
Findings
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
895–909
URL:
https://aclanthology.org/2021.findings-emnlp.76
DOI:
10.18653/v1/2021.findings-emnlp.76
Cite (ACL):
Yixuan Su, David Vandyke, Sihui Wang, Yimai Fang, and Nigel Collier. 2021. Plan-then-Generate: Controlled Data-to-Text Generation via Planning. In Findings of the Association for Computational Linguistics: EMNLP 2021, pages 895–909, Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Plan-then-Generate: Controlled Data-to-Text Generation via Planning (Su et al., Findings 2021)
PDF:
https://aclanthology.org/2021.findings-emnlp.76.pdf
Video:
https://aclanthology.org/2021.findings-emnlp.76.mp4
Code:
google-research-datasets/ToTTo + additional community code