Can Neural Generators for Dialogue Learn Sentence Planning and Discourse Structuring?

Lena Reed, Shereen Oraby, Marilyn Walker


Abstract
Responses in task-oriented dialogue systems often realize multiple propositions whose ultimate form depends on the use of sentence planning and discourse structuring operations. For example, a recommendation may consist of an explicitly evaluative utterance, e.g., Chanpen Thai is the best option, along with content related by the justification discourse relation, e.g., It has great food and service, that combines multiple propositions into a single phrase. While neural generation methods integrate sentence planning and surface realization in one end-to-end learning framework, previous work has not shown that neural generators can: (1) perform common sentence planning and discourse structuring operations; (2) make decisions as to whether to realize content in a single sentence or over multiple sentences; (3) generalize sentence planning and discourse relation operations beyond what was seen in training. We systematically create large training corpora that exhibit particular sentence planning operations and then test neural models to see what they learn. We compare models without explicit latent variables for sentence planning against models that receive explicit supervision during training. We show that only the models with additional supervision can reproduce sentence planning and discourse operations and generalize to situations unseen in training.
Anthology ID:
W18-6535
Volume:
Proceedings of the 11th International Conference on Natural Language Generation
Month:
November
Year:
2018
Address:
Tilburg University, The Netherlands
Editors:
Emiel Krahmer, Albert Gatt, Martijn Goudbeek
Venue:
INLG
SIG:
SIGGEN
Publisher:
Association for Computational Linguistics
Pages:
284–295
URL:
https://aclanthology.org/W18-6535
DOI:
10.18653/v1/W18-6535
Cite (ACL):
Lena Reed, Shereen Oraby, and Marilyn Walker. 2018. Can Neural Generators for Dialogue Learn Sentence Planning and Discourse Structuring?. In Proceedings of the 11th International Conference on Natural Language Generation, pages 284–295, Tilburg University, The Netherlands. Association for Computational Linguistics.
Cite (Informal):
Can Neural Generators for Dialogue Learn Sentence Planning and Discourse Structuring? (Reed et al., INLG 2018)
PDF:
https://aclanthology.org/W18-6535.pdf