Slot Tagging for Task Oriented Spoken Language Understanding in Human-to-Human Conversation Scenarios

Kunho Kim, Rahul Jha, Kyle Williams, Alex Marin, Imed Zitouni


Abstract
Task-oriented language understanding (LU) in human-to-machine (H2M) conversations has been extensively studied for personal digital assistants. In this work, we extend the task-oriented LU problem to human-to-human (H2H) conversations, focusing on the slot tagging task. Recent advances in LU for H2M conversations have shown accuracy improvements from adding encoded knowledge from different sources. Inspired by this, we explore several variants of a bidirectional LSTM architecture that rely on different knowledge sources, such as Web data, search engine click logs, expert feedback from H2M models, and previous utterances in the conversation. We also propose ensemble techniques that aggregate these different knowledge sources into a single model. Experimental evaluation on a four-turn Twitter dataset in the restaurant and music domains shows improvements in slot tagging F1-score of up to 6.09% over existing approaches.
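The abstract describes aggregating per-source slot taggers into a single prediction. The paper page does not spell out the aggregation rule, so the following is only an illustrative sketch: a majority-vote ensemble over per-token BIO slot tags, with hypothetical model outputs for a restaurant-domain utterance.

```python
from collections import Counter

def ensemble_slot_tags(predictions):
    """Majority-vote ensemble over per-token BIO slot tags.

    `predictions` is a list of tag sequences, one per model (e.g.,
    models using Web data, click logs, or H2M expert feedback);
    all sequences tag the same utterance tokens.
    """
    ensembled = []
    for token_tags in zip(*predictions):
        # Most common tag wins; Counter.most_common breaks ties by
        # first-seen order, a simple deterministic rule.
        tag, _ = Counter(token_tags).most_common(1)[0]
        ensembled.append(tag)
    return ensembled

# Hypothetical tags from three knowledge-source models for the
# utterance "book a table at nobu":
web_model    = ["O", "O", "O", "O", "B-restaurant"]
click_model  = ["O", "O", "O", "O", "B-restaurant"]
expert_model = ["O", "O", "B-time", "O", "B-restaurant"]

print(ensemble_slot_tags([web_model, click_model, expert_model]))
# → ['O', 'O', 'O', 'O', 'B-restaurant']
```

A weighted vote or a learned combination layer would be a natural refinement; this stdlib-only version just makes the aggregation idea concrete.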
Anthology ID:
K19-1071
Volume:
Proceedings of the 23rd Conference on Computational Natural Language Learning (CoNLL)
Month:
November
Year:
2019
Address:
Hong Kong, China
Editors:
Mohit Bansal, Aline Villavicencio
Venue:
CoNLL
SIG:
SIGNLL
Publisher:
Association for Computational Linguistics
Pages:
757–767
URL:
https://aclanthology.org/K19-1071
DOI:
10.18653/v1/K19-1071
Cite (ACL):
Kunho Kim, Rahul Jha, Kyle Williams, Alex Marin, and Imed Zitouni. 2019. Slot Tagging for Task Oriented Spoken Language Understanding in Human-to-Human Conversation Scenarios. In Proceedings of the 23rd Conference on Computational Natural Language Learning (CoNLL), pages 757–767, Hong Kong, China. Association for Computational Linguistics.
Cite (Informal):
Slot Tagging for Task Oriented Spoken Language Understanding in Human-to-Human Conversation Scenarios (Kim et al., CoNLL 2019)
PDF:
https://aclanthology.org/K19-1071.pdf