%0 Conference Proceedings
%T Neural NLG for Methodius: From RST Meaning Representations to Texts
%A Stevens-Guille, Symon
%A Maskharashvili, Aleksandre
%A Isard, Amy
%A Li, Xintong
%A White, Michael
%Y Davis, Brian
%Y Graham, Yvette
%Y Kelleher, John
%Y Sripada, Yaji
%S Proceedings of the 13th International Conference on Natural Language Generation
%D 2020
%8 December
%I Association for Computational Linguistics
%C Dublin, Ireland
%F stevens-guille-etal-2020-neural
%X While classic NLG systems typically made use of hierarchically structured content plans that included discourse relations as central components, more recent neural approaches have mostly mapped simple, flat inputs to texts without representing discourse relations explicitly. In this paper, we investigate whether it is beneficial to include discourse relations in the input to neural data-to-text generators for texts where discourse relations play an important role. To do so, we reimplement the sentence planning and realization components of a classic NLG system, Methodius, using LSTM sequence-to-sequence (seq2seq) models. We find that although seq2seq models can learn to generate fluent and grammatical texts remarkably well with sufficiently representative Methodius training data, they cannot learn to correctly express Methodius’s similarity and contrast comparisons unless the corresponding RST relations are included in the inputs. Additionally, we experiment with using self-training and reverse model reranking to better handle train/test data mismatches, and find that while these methods help reduce content errors, it remains essential to include discourse relations in the input to obtain optimal performance.
%R 10.18653/v1/2020.inlg-1.37
%U https://aclanthology.org/2020.inlg-1.37
%U https://doi.org/10.18653/v1/2020.inlg-1.37
%P 306-315