%0 Conference Proceedings
%T Improving Slot Filling by Utilizing Contextual Information
%A Pouran Ben Veyseh, Amir
%A Dernoncourt, Franck
%A Nguyen, Thien Huu
%Y Wen, Tsung-Hsien
%Y Celikyilmaz, Asli
%Y Yu, Zhou
%Y Papangelis, Alexandros
%Y Eric, Mihail
%Y Kumar, Anuj
%Y Casanueva, Iñigo
%Y Shah, Rushin
%S Proceedings of the 2nd Workshop on Natural Language Processing for Conversational AI
%D 2020
%8 July
%I Association for Computational Linguistics
%C Online
%F pouran-ben-veyseh-etal-2020-improving
%X Slot Filling (SF) is a sub-task of Spoken Language Understanding (SLU) that aims to extract semantic constituents from a given natural language utterance. It is formulated as a sequence labeling task. Recently, it has been shown that contextual information is vital for this task. However, existing models employ contextual information in a restricted manner, e.g., using self-attention. Such methods fail to distinguish the effects of the context on the word representation and the word label. To address this issue, in this paper, we propose a novel method to incorporate contextual information at two different levels, i.e., the representation level and the task-specific (i.e., label) level. Our extensive experiments on three benchmark SF datasets show the effectiveness of our model, which achieves new state-of-the-art results on all three datasets.
%R 10.18653/v1/2020.nlp4convai-1.11
%U https://aclanthology.org/2020.nlp4convai-1.11
%U https://doi.org/10.18653/v1/2020.nlp4convai-1.11
%P 90-95