A Task-Oriented Dialogue Architecture via Transformer Neural Language Models and Symbolic Injection

Oscar J. Romero, Antian Wang, John Zimmerman, Aaron Steinfeld, Anthony Tomasic


Abstract
Recently, transformer language models have been applied to build both task- and non-task-oriented dialogue systems. Although transformers perform well on most NLP tasks, they perform poorly on context retrieval and symbolic reasoning. Our work aims to address this limitation by embedding the model in an operational loop that blends natural language generation with symbolic injection. We evaluated our system on the multi-domain DSTC8 data set and report a joint goal accuracy of 75.8% (ranking in the top half of submissions), an intent accuracy of 97.4% (higher than reported in the literature), and a 15% improvement in success rate over a baseline with no symbolic injection. These promising results suggest that transformer language models can generate not only proper system responses but also symbolic representations, which can further enhance the overall quality of dialogue management and serve as scaffolding for complex conversational reasoning.
Anthology ID:
2021.sigdial-1.46
Volume:
Proceedings of the 22nd Annual Meeting of the Special Interest Group on Discourse and Dialogue
Month:
July
Year:
2021
Address:
Singapore and Online
Editors:
Haizhou Li, Gina-Anne Levow, Zhou Yu, Chitralekha Gupta, Berrak Sisman, Siqi Cai, David Vandyke, Nina Dethlefs, Yan Wu, Junyi Jessy Li
Venue:
SIGDIAL
SIG:
SIGDIAL
Publisher:
Association for Computational Linguistics
Pages:
438–444
URL:
https://aclanthology.org/2021.sigdial-1.46
DOI:
10.18653/v1/2021.sigdial-1.46
Cite (ACL):
Oscar J. Romero, Antian Wang, John Zimmerman, Aaron Steinfeld, and Anthony Tomasic. 2021. A Task-Oriented Dialogue Architecture via Transformer Neural Language Models and Symbolic Injection. In Proceedings of the 22nd Annual Meeting of the Special Interest Group on Discourse and Dialogue, pages 438–444, Singapore and Online. Association for Computational Linguistics.
Cite (Informal):
A Task-Oriented Dialogue Architecture via Transformer Neural Language Models and Symbolic Injection (Romero et al., SIGDIAL 2021)
PDF:
https://aclanthology.org/2021.sigdial-1.46.pdf
Video:
https://www.youtube.com/watch?v=mv-YwVGKhh8