%0 Journal Article
%T Data-to-text Generation with Variational Sequential Planning
%A Puduppully, Ratish
%A Fu, Yao
%A Lapata, Mirella
%J Transactions of the Association for Computational Linguistics
%D 2022
%V 10
%I MIT Press
%C Cambridge, MA
%F puduppully-etal-2022-data
%X We consider the task of data-to-text generation, which aims to create textual output from non-linguistic input. We focus on generating long-form text, that is, documents with multiple paragraphs, and propose a neural model enhanced with a planning component responsible for organizing high-level information in a coherent and meaningful way. We infer latent plans sequentially with a structured variational model, while interleaving the steps of planning and generation. Text is generated by conditioning on previous variational decisions and previously generated text. Experiments on two data-to-text benchmarks (RotoWire and MLB) show that our model outperforms strong baselines and is sample-efficient in the face of limited training data (e.g., a few hundred instances).
%R 10.1162/tacl_a_00484
%U https://aclanthology.org/2022.tacl-1.40
%U https://doi.org/10.1162/tacl_a_00484
%P 697-715