Flexibly-Structured Model for Task-Oriented Dialogues

Lei Shu, Piero Molino, Mahdi Namazifar, Hu Xu, Bing Liu, Huaixiu Zheng, Gokhan Tur


Abstract
This paper proposes a novel end-to-end architecture for task-oriented dialogue systems. It is based on a simple and practical yet very effective sequence-to-sequence approach, in which the language understanding and state tracking tasks are modeled jointly with a structured copy-augmented sequential decoder and a multi-label decoder for each slot. The policy engine and language generation tasks are then modeled jointly on top of this. The copy-augmented sequential decoder handles new or unknown values in the conversation, while the multi-label decoder, combined with the sequential decoder, ensures that values are explicitly assigned to slots. On the generation side, slot binary classifiers are used to improve performance. This architecture scales to real-world scenarios and is shown through an empirical evaluation to achieve state-of-the-art performance on both the Cambridge Restaurant dataset and the Stanford in-car assistant dataset.
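The two belief-decoding components described above can be illustrated with a short sketch. The following PyTorch snippet is a minimal, assumed rendering rather than the released uber-research/FSDM code: the class names, the GRU cell, the dot-product attention, and the hyperparameters are illustrative choices. It shows a copy-augmented decoding step that mixes a vocabulary ("generate") distribution with an attention ("copy") distribution over dialogue-history tokens, so unseen slot values can still be produced, and a per-slot multi-label head that assigns values to slots explicitly.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class CopyAugmentedDecoderStep(nn.Module):
    """One decoding step: P(w) = p_gen * P_vocab(w) + (1 - p_gen) * P_copy(w)."""

    def __init__(self, vocab_size, hidden_size):
        super().__init__()
        self.cell = nn.GRUCell(hidden_size, hidden_size)
        self.vocab_proj = nn.Linear(2 * hidden_size, vocab_size)
        self.gen_gate = nn.Linear(2 * hidden_size, 1)

    def forward(self, y_emb, state, enc_outputs, enc_token_ids):
        # y_emb: (B, H) embedding of previous token; enc_outputs: (B, T, H);
        # enc_token_ids: (B, T) vocabulary ids of the dialogue-history tokens.
        state = self.cell(y_emb, state)
        # Dot-product attention over encoder states gives the copy distribution.
        attn_scores = torch.bmm(enc_outputs, state.unsqueeze(2)).squeeze(2)   # (B, T)
        attn = F.softmax(attn_scores, dim=-1)
        context = torch.bmm(attn.unsqueeze(1), enc_outputs).squeeze(1)        # (B, H)
        features = torch.cat([state, context], dim=-1)
        p_vocab = F.softmax(self.vocab_proj(features), dim=-1)                # (B, V)
        p_gen = torch.sigmoid(self.gen_gate(features))                        # (B, 1)
        # Scatter attention mass onto the vocabulary ids of the input tokens,
        # which lets the decoder emit values never seen during training.
        p_copy = torch.zeros_like(p_vocab).scatter_add(1, enc_token_ids, attn)
        return p_gen * p_vocab + (1.0 - p_gen) * p_copy, state


class MultiLabelSlotHead(nn.Module):
    """Per-slot multi-label classifier over candidate values."""

    def __init__(self, hidden_size, num_values_per_slot):
        super().__init__()
        self.heads = nn.ModuleList(
            [nn.Linear(hidden_size, n) for n in num_values_per_slot]
        )

    def forward(self, state_vec):
        # Independent sigmoids allow zero, one, or several values per slot,
        # making the value-to-slot assignment explicit.
        return [torch.sigmoid(head(state_vec)) for head in self.heads]
```

In the full model described in the paper, these belief-tracking outputs would feed a jointly trained policy and response-generation stage with slot binary classifiers; that part is omitted here for brevity.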
Anthology ID:
W19-5922
Volume:
Proceedings of the 20th Annual SIGdial Meeting on Discourse and Dialogue
Month:
September
Year:
2019
Address:
Stockholm, Sweden
Editors:
Satoshi Nakamura, Milica Gasic, Ingrid Zukerman, Gabriel Skantze, Mikio Nakano, Alexandros Papangelis, Stefan Ultes, Koichiro Yoshino
Venue:
SIGDIAL
SIG:
SIGDIAL
Publisher:
Association for Computational Linguistics
Pages:
178–187
URL:
https://aclanthology.org/W19-5922
DOI:
10.18653/v1/W19-5922
Cite (ACL):
Lei Shu, Piero Molino, Mahdi Namazifar, Hu Xu, Bing Liu, Huaixiu Zheng, and Gokhan Tur. 2019. Flexibly-Structured Model for Task-Oriented Dialogues. In Proceedings of the 20th Annual SIGdial Meeting on Discourse and Dialogue, pages 178–187, Stockholm, Sweden. Association for Computational Linguistics.
Cite (Informal):
Flexibly-Structured Model for Task-Oriented Dialogues (Shu et al., SIGDIAL 2019)
PDF:
https://aclanthology.org/W19-5922.pdf
Code:
uber-research/FSDM