SoDA: On-device Conversational Slot Extraction

Sujith Ravi, Zornitsa Kozareva


Abstract
We propose a novel on-device neural sequence labeling model which uses embedding-free projections and character information to construct compact word representations, and learns a sequence model that combines a bidirectional LSTM with self-attention and a CRF. Unlike typical dialog models that rely on huge, complex neural network architectures and large-scale pre-trained Transformers to achieve state-of-the-art results, our method achieves comparable results to BERT and even outperforms its smaller variant DistilBERT on conversational slot extraction tasks. Our method is faster than BERT models while achieving a significant reduction in model size: our model requires 135x and 81x fewer parameters than BERT and DistilBERT, respectively. We conduct experiments on multiple conversational datasets and show significant improvements over existing methods, including recent on-device models. Experimental results and ablation studies also show that our neural models preserve the tiny memory footprint necessary to operate on smart devices, while still maintaining high performance.
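To make the architecture described in the abstract concrete, below is a minimal illustrative sketch (not the authors' code): an embedding-free, hash-based word projection built from character n-grams, fed into a bidirectional LSTM with self-attention and a linear emission layer. A CRF decoding layer would normally sit on top of the emissions; it is omitted here for brevity. All dimensions, the hashing scheme, and the tag count are assumptions for illustration, not values from the paper.

import hashlib
import torch
import torch.nn as nn

PROJ_DIM = 256  # assumed projection size; the paper's setting may differ

def project_word(word: str, dim: int = PROJ_DIM) -> torch.Tensor:
    """Embedding-free word representation: hash character n-grams into a
    fixed-size +/-1 feature vector (an LSH-style projection)."""
    vec = torch.zeros(dim)
    ngrams = [word[i:i + n] for n in (2, 3) for i in range(len(word) - n + 1)]
    for g in ngrams or [word]:
        h = int(hashlib.md5(g.encode()).hexdigest(), 16)
        vec[h % dim] += 1.0 if (h >> 1) % 2 == 0 else -1.0
    return vec

class SlotTagger(nn.Module):
    def __init__(self, num_tags: int, hidden: int = 128):
        super().__init__()
        self.lstm = nn.LSTM(PROJ_DIM, hidden, bidirectional=True, batch_first=True)
        self.attn = nn.MultiheadAttention(2 * hidden, num_heads=4, batch_first=True)
        self.emit = nn.Linear(2 * hidden, num_tags)  # emission scores for a CRF

    def forward(self, x):                    # x: (batch, seq_len, PROJ_DIM)
        h, _ = self.lstm(x)
        a, _ = self.attn(h, h, h)            # self-attention over BiLSTM states
        return self.emit(h + a)              # per-token slot tag scores

# Usage: project a tokenized utterance and score BIO slot tags.
tokens = "book a table for two".split()
feats = torch.stack([project_word(t) for t in tokens]).unsqueeze(0)
scores = SlotTagger(num_tags=9)(feats)       # shape (1, 5, 9)

Because project_word replaces a learned embedding matrix, the word-representation step adds no trainable parameters, which is the main lever for the model-size reductions reported in the abstract.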
Anthology ID:
2021.sigdial-1.7
Volume:
Proceedings of the 22nd Annual Meeting of the Special Interest Group on Discourse and Dialogue
Month:
July
Year:
2021
Address:
Singapore and Online
Editors:
Haizhou Li, Gina-Anne Levow, Zhou Yu, Chitralekha Gupta, Berrak Sisman, Siqi Cai, David Vandyke, Nina Dethlefs, Yan Wu, Junyi Jessy Li
Venue:
SIGDIAL
SIG:
SIGDIAL
Publisher:
Association for Computational Linguistics
Pages:
56–65
URL:
https://aclanthology.org/2021.sigdial-1.7
DOI:
10.18653/v1/2021.sigdial-1.7
Cite (ACL):
Sujith Ravi and Zornitsa Kozareva. 2021. SoDA: On-device Conversational Slot Extraction. In Proceedings of the 22nd Annual Meeting of the Special Interest Group on Discourse and Dialogue, pages 56–65, Singapore and Online. Association for Computational Linguistics.
Cite (Informal):
SoDA: On-device Conversational Slot Extraction (Ravi & Kozareva, SIGDIAL 2021)
PDF:
https://aclanthology.org/2021.sigdial-1.7.pdf
Video:
https://www.youtube.com/watch?v=0hDaafkctwI