Encoding Word Confusion Networks with Recurrent Neural Networks for Dialog State Tracking

Glorianna Jagfeld, Ngoc Thang Vu


Abstract
This paper presents our novel method for encoding word confusion networks, which can represent a rich hypothesis space of automatic speech recognition systems, with recurrent neural networks. We demonstrate the utility of our approach for the task of dialog state tracking in spoken dialog systems that rely on automatic speech recognition output. Encoding confusion networks outperforms encoding the best automatic speech recognition hypothesis in a neural dialog state tracking system on the well-known second Dialog State Tracking Challenge dataset.
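To make the encoding idea concrete, below is a minimal sketch (not the authors' implementation) of one way a confusion-network encoder could look in PyTorch: each time slot of the confusion network is summarised as the posterior-weighted sum of its word embeddings, and the resulting slot sequence is fed to a GRU. The class name, the weighted-sum pooling, and all dimensions are illustrative assumptions.

import torch
import torch.nn as nn

class ConfusionNetworkEncoder(nn.Module):
    # Hypothetical sketch of a word confusion network (WCN) encoder.
    # A WCN is treated as a sequence of time slots, each holding several
    # alternative word hypotheses with ASR posterior probabilities.
    def __init__(self, vocab_size, emb_dim=100, hidden_dim=128):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim)
        self.rnn = nn.GRU(emb_dim, hidden_dim, batch_first=True)

    def forward(self, word_ids, posteriors):
        # word_ids:   (batch, slots, alternatives) integer word indices
        # posteriors: (batch, slots, alternatives) posterior of each hypothesis
        emb = self.embedding(word_ids)                        # (B, T, A, E)
        slot_vecs = (emb * posteriors.unsqueeze(-1)).sum(2)   # weighted sum per slot
        _, hidden = self.rnn(slot_vecs)                       # hidden: (1, B, H)
        return hidden.squeeze(0)                              # fixed-size utterance encoding

# Toy usage with random inputs: 2 utterances, 7 slots, 3 alternatives per slot.
encoder = ConfusionNetworkEncoder(vocab_size=5000)
ids = torch.randint(0, 5000, (2, 7, 3))
post = torch.softmax(torch.rand(2, 7, 3), dim=-1)
utt_repr = encoder(ids, post)                                 # shape (2, 128)

The resulting fixed-size utterance vector could then feed a downstream dialog state tracker, for example a classifier over slot-value pairs; the baseline compared against in the paper would instead encode only the single best ASR hypothesis.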
Anthology ID: W17-4602
Volume: Proceedings of the Workshop on Speech-Centric Natural Language Processing
Month: September
Year: 2017
Address: Copenhagen, Denmark
Editors: Nicholas Ruiz, Srinivas Bangalore
Venue: WS
Publisher: Association for Computational Linguistics
Pages: 10–17
URL: https://aclanthology.org/W17-4602
DOI: 10.18653/v1/W17-4602
Cite (ACL): Glorianna Jagfeld and Ngoc Thang Vu. 2017. Encoding Word Confusion Networks with Recurrent Neural Networks for Dialog State Tracking. In Proceedings of the Workshop on Speech-Centric Natural Language Processing, pages 10–17, Copenhagen, Denmark. Association for Computational Linguistics.
Cite (Informal): Encoding Word Confusion Networks with Recurrent Neural Networks for Dialog State Tracking (Jagfeld & Vu, 2017)
PDF: https://aclanthology.org/W17-4602.pdf