Neural NLG for Methodius: From RST Meaning Representations to Texts

Symon Stevens-Guille, Aleksandre Maskharashvili, Amy Isard, Xintong Li, Michael White


Abstract
While classic NLG systems typically made use of hierarchically structured content plans that included discourse relations as central components, more recent neural approaches have mostly mapped simple, flat inputs to texts without representing discourse relations explicitly. In this paper, we investigate whether it is beneficial to include discourse relations in the input to neural data-to-text generators for texts where discourse relations play an important role. To do so, we reimplement the sentence planning and realization components of a classic NLG system, Methodius, using LSTM sequence-to-sequence (seq2seq) models. We find that although seq2seq models can learn to generate fluent and grammatical texts remarkably well with sufficiently representative Methodius training data, they cannot learn to correctly express Methodius’s similarity and contrast comparisons unless the corresponding RST relations are included in the inputs. Additionally, we experiment with using self-training and reverse model reranking to better handle train/test data mismatches, and find that while these methods help reduce content errors, it remains essential to include discourse relations in the input to obtain optimal performance.
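As a rough illustration of the central manipulation the abstract describes (including vs. omitting RST discourse relations in the generator's input), the following sketch shows one way an RST-style meaning representation with relations such as contrast could be flattened into a token sequence for a seq2seq model, alongside an ablated version without relation markers. This is a hypothetical sketch only: the class names, relation labels, and fact predicates here are illustrative assumptions, not the actual Methodius MR format or the paper's linearization scheme.

```python
# Hypothetical sketch: linearizing an RST-style meaning representation
# (with explicit discourse relations such as contrast) into a flat token
# sequence for a seq2seq generator. Names and the MR format are
# illustrative assumptions, not the Methodius system's actual format.

from dataclasses import dataclass, field
from typing import List, Union


@dataclass
class Fact:
    """An atomic content unit, e.g. an attribute of a museum exhibit."""
    predicate: str
    args: List[str]


@dataclass
class RstNode:
    """A discourse-relation node combining two or more children."""
    relation: str                                      # e.g. "contrast", "similarity"
    children: List[Union["RstNode", Fact]] = field(default_factory=list)


def linearize(node: Union[RstNode, Fact]) -> List[str]:
    """Flatten an MR tree into tokens, keeping relation labels as
    bracketing markers so the model can see discourse structure."""
    if isinstance(node, Fact):
        return [node.predicate, "("] + node.args + [")"]
    tokens = [f"[{node.relation}"]
    for child in node.children:
        tokens.extend(linearize(child))
    tokens.append("]")
    return tokens


if __name__ == "__main__":
    mr = RstNode(
        relation="contrast",
        children=[
            Fact("creation-period", ["exhibit-1", "archaic-period"]),
            Fact("creation-period", ["exhibit-2", "classical-period"]),
        ],
    )
    # Input with discourse relations kept as bracketing markers.
    print(" ".join(linearize(mr)))
    # Ablated input: drop the relation markers, keeping only the facts.
    print(" ".join(t for t in linearize(mr)
                   if not t.startswith("[") and t != "]"))
```

Under this kind of setup, the paper's finding would correspond to the model expressing contrast and similarity correctly only when trained on the first style of input, not the ablated one.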
Anthology ID:
2020.inlg-1.37
Volume:
Proceedings of the 13th International Conference on Natural Language Generation
Month:
December
Year:
2020
Address:
Dublin, Ireland
Editors:
Brian Davis, Yvette Graham, John Kelleher, Yaji Sripada
Venue:
INLG
SIG:
SIGGEN
Publisher:
Association for Computational Linguistics
Pages:
306–315
URL:
https://aclanthology.org/2020.inlg-1.37
DOI:
10.18653/v1/2020.inlg-1.37
Cite (ACL):
Symon Stevens-Guille, Aleksandre Maskharashvili, Amy Isard, Xintong Li, and Michael White. 2020. Neural NLG for Methodius: From RST Meaning Representations to Texts. In Proceedings of the 13th International Conference on Natural Language Generation, pages 306–315, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Neural NLG for Methodius: From RST Meaning Representations to Texts (Stevens-Guille et al., INLG 2020)
PDF:
https://aclanthology.org/2020.inlg-1.37.pdf
Supplementary attachment:
2020.inlg-1.37.Supplementary_Attachment.pdf
Code:
methodius-project/neural-methodius
Data:
WebNLG