Multi-task Learning for Natural Language Generation in Task-Oriented Dialogue

Chenguang Zhu, Michael Zeng, Xuedong Huang


Abstract
In task-oriented dialogues, Natural Language Generation (NLG) is the final yet crucial step that produces user-facing system utterances. The result of NLG directly determines the perceived quality and usability of a dialogue system. While most existing systems produce semantically correct responses for the goals they are given, they struggle to match the variation and fluency of human language. In this paper, we propose a novel multi-task learning framework, NLG-LM, for natural language generation. In addition to generating high-quality responses that convey the required information, it also explicitly targets naturalness in generated responses via an unconditioned language model. This can significantly improve the learning of style and variation in human language. Empirical results show that this multi-task learning framework outperforms previous models across multiple datasets. For example, it improves the previous best BLEU score on the E2E-NLG dataset by 2.2%, and on the Laptop dataset by 6.1%.
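The abstract describes the core idea of NLG-LM: a single generator trained jointly on (a) conditioned generation of the response from a meaning representation and (b) unconditioned language modeling over the same response text. The sketch below is not the authors' released code; it is a minimal PyTorch-style illustration of that multi-task objective, and all class names, dimensions, and the weighting factor lm_weight are illustrative assumptions.

# Minimal sketch (assumption: GRU encoder-decoder, equal task weighting by default).
import torch
import torch.nn as nn

class NLGLMSketch(nn.Module):
    def __init__(self, vocab_size, hidden=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        self.encoder = nn.GRU(hidden, hidden, batch_first=True)  # encodes the meaning representation
        self.decoder = nn.GRU(hidden, hidden, batch_first=True)  # shared decoder for both tasks
        self.out = nn.Linear(hidden, vocab_size)
        self.loss = nn.CrossEntropyLoss()

    def forward(self, mr_tokens, response_tokens, lm_weight=1.0):
        # Task 1: conditioned NLG -- decode the response from the encoded MR state.
        _, h = self.encoder(self.embed(mr_tokens))
        dec_in, target = response_tokens[:, :-1], response_tokens[:, 1:]
        nlg_hidden, _ = self.decoder(self.embed(dec_in), h)
        nlg_loss = self.loss(self.out(nlg_hidden).transpose(1, 2), target)

        # Task 2: unconditioned LM -- same decoder, zero initial state, no MR conditioning.
        lm_hidden, _ = self.decoder(self.embed(dec_in))
        lm_loss = self.loss(self.out(lm_hidden).transpose(1, 2), target)

        # Joint multi-task objective.
        return nlg_loss + lm_weight * lm_loss

if __name__ == "__main__":
    model = NLGLMSketch(vocab_size=1000)
    mr = torch.randint(0, 1000, (4, 12))    # toy batch of meaning-representation token ids
    resp = torch.randint(0, 1000, (4, 20))  # toy batch of reference response token ids
    model(mr, resp).backward()

The point of sharing the decoder across both losses, as the abstract suggests, is that the unconditioned language-modeling task pushes the decoder toward fluent, natural phrasing while the conditioned task keeps the output semantically faithful to the input goal.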
Anthology ID:
D19-1123
Volume:
Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP)
Month:
November
Year:
2019
Address:
Hong Kong, China
Editors:
Kentaro Inui, Jing Jiang, Vincent Ng, Xiaojun Wan
Venues:
EMNLP | IJCNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
1261–1266
URL:
https://aclanthology.org/D19-1123
DOI:
10.18653/v1/D19-1123
Cite (ACL):
Chenguang Zhu, Michael Zeng, and Xuedong Huang. 2019. Multi-task Learning for Natural Language Generation in Task-Oriented Dialogue. In Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), pages 1261–1266, Hong Kong, China. Association for Computational Linguistics.
Cite (Informal):
Multi-task Learning for Natural Language Generation in Task-Oriented Dialogue (Zhu et al., EMNLP-IJCNLP 2019)
PDF:
https://aclanthology.org/D19-1123.pdf
Attachment:
D19-1123.Attachment.pdf