%0 Conference Proceedings
%T A Two-stage Model for Slot Filling in Low-resource Settings: Domain-agnostic Non-slot Reduction and Pretrained Contextual Embeddings
%A Oguz, Cennet
%A Vu, Ngoc Thang
%Y Moosavi, Nafise Sadat
%Y Fan, Angela
%Y Shwartz, Vered
%Y Glavaš, Goran
%Y Joty, Shafiq
%Y Wang, Alex
%Y Wolf, Thomas
%S Proceedings of SustaiNLP: Workshop on Simple and Efficient Natural Language Processing
%D 2020
%8 November
%I Association for Computational Linguistics
%C Online
%F oguz-vu-2020-two
%X Learning-based slot filling, a key component of spoken language understanding systems, typically requires a large amount of in-domain hand-labeled data for training. In this paper, we propose a novel two-stage model architecture that can be trained with only a few in-domain hand-labeled examples. The first stage is designed to remove non-slot tokens (i.e., O-labeled tokens), as they introduce noise into the input of slot filling models. This stage is domain-agnostic and can therefore be trained by exploiting out-of-domain data. The second stage identifies slot names only for slot tokens by using state-of-the-art pretrained contextual embeddings such as ELMo and BERT. We show that our approach outperforms other state-of-the-art systems on the SNIPS benchmark dataset.
%R 10.18653/v1/2020.sustainlp-1.10
%U https://aclanthology.org/2020.sustainlp-1.10
%U https://doi.org/10.18653/v1/2020.sustainlp-1.10
%P 73-82
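
For illustration, a minimal Python sketch of the two-stage pipeline the abstract describes: stage one filters out non-slot (O-labeled) tokens, and stage two assigns slot names only to the tokens that remain. The function names, the stopword rule, and the lookup lexicon below are hypothetical stand-ins; the paper instead trains a domain-agnostic binary tagger (trainable on out-of-domain data) and a slot-name classifier over pretrained contextual embeddings such as ELMo or BERT.

# Sketch of the two-stage slot-filling pipeline from the abstract.
# The stand-in classifiers are toy placeholders, not the paper's models.

from typing import List, Tuple


def stage1_reduce_non_slots(tokens: List[str]) -> List[Tuple[int, str]]:
    """Stage 1: drop non-slot (O-labeled) tokens.

    In the paper this is a learned, domain-agnostic binary tagger;
    here a toy stopword rule stands in for it.
    """
    non_slot_words = {"play", "the", "a", "some", "by", "please"}  # toy rule
    return [(i, t) for i, t in enumerate(tokens)
            if t.lower() not in non_slot_words]


def stage2_label_slots(slot_tokens: List[Tuple[int, str]]) -> List[Tuple[int, str, str]]:
    """Stage 2: assign slot names to the surviving tokens.

    The paper classifies these tokens using contextual embeddings;
    a hypothetical lexicon lookup stands in for that classifier.
    """
    toy_lexicon = {"jazz": "genre", "coltrane": "artist"}  # hypothetical
    return [(i, t, toy_lexicon.get(t.lower(), "unknown"))
            for i, t in slot_tokens]


if __name__ == "__main__":
    utterance = "play some jazz by Coltrane".split()
    kept = stage1_reduce_non_slots(utterance)  # noisy O tokens removed
    print(stage2_label_slots(kept))
    # -> [(2, 'jazz', 'genre'), (4, 'Coltrane', 'artist')]

Splitting the task this way means the noise-reducing first stage can be trained without in-domain labels, so only the small second-stage classifier needs the few in-domain hand-labeled examples.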