Dialogue State Tracking with Sparse Local Slot Attention

Longfei Yang, Jiyi Li, Sheng Li, Takahiro Shinozaki


Abstract
Dialogue state tracking (DST), a core component of task-oriented dialogue systems, tracks the dialogue state over the course of a conversation between the user and the system. Mainstream models predict the value for each slot with fully token-wise slot attention over the dialogue history. However, such attention may overlook the neighboring relationships among tokens, and it may lead the model to assign probability mass to irrelevant parts of the history that contribute little to value prediction, a problem that grows more severe as dialogues get longer. We therefore investigate sparse local slot attention for DST in this work. Slot-specific local semantic information is first obtained at a sub-sampled temporal resolution, capturing the local dependencies relevant to each slot. These local representations are then attended with sparse attention weights, guiding the model to focus on the relevant parts of the local information for subsequent state value prediction. Experimental results on the MultiWOZ 2.0 and 2.4 datasets show that the proposed approach effectively improves ontology-based dialogue state tracking and outperforms token-wise attention on long dialogues.
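The abstract suggests a two-step computation: sub-sample the token sequence into local windows, then let each slot attend over those windows with sparsified weights. Below is a minimal PyTorch sketch of that idea, not the authors' implementation: the strided 1-D convolution for sub-sampling and the top-k masking for sparsification are illustrative assumptions (alternatives such as sparsemax would serve the same purpose), the window size and k are arbitrary, and slot-specificity enters here only through the attention query, whereas the paper may condition the local extraction on the slot as well.

# A minimal sketch of sparse local slot attention, under the assumptions above.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseLocalSlotAttention(nn.Module):
    def __init__(self, hidden: int, window: int = 4, top_k: int = 8):
        super().__init__()
        # A 1-D convolution with stride == kernel size sub-samples the token
        # sequence into non-overlapping local representations, one per window.
        self.local = nn.Conv1d(hidden, hidden, kernel_size=window, stride=window)
        self.top_k = top_k
        self.scale = hidden ** -0.5

    def forward(self, tokens: torch.Tensor, slot: torch.Tensor) -> torch.Tensor:
        # tokens: (batch, seq_len, hidden) encodings of the dialogue history
        # slot:   (batch, hidden)          slot-query embedding
        local = self.local(tokens.transpose(1, 2)).transpose(1, 2)  # (B, L', H)
        scores = torch.einsum("bh,blh->bl", slot, local) * self.scale
        # Sparsify: keep only the k highest-scoring local windows per slot and
        # mask the rest, so the softmax assigns them zero probability.
        k = min(self.top_k, scores.size(1))
        kth = scores.topk(k, dim=1).values[:, -1:]
        scores = scores.masked_fill(scores < kth, float("-inf"))
        weights = F.softmax(scores, dim=1)                          # (B, L')
        return torch.einsum("bl,blh->bh", weights, local)           # (B, H)

# Usage: pool a 50-token dialogue encoding into one slot-specific context vector.
attn = SparseLocalSlotAttention(hidden=768)
ctx = attn(torch.randn(2, 50, 768), torch.randn(2, 768))
print(ctx.shape)  # torch.Size([2, 768])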
Anthology ID:
2023.nlp4convai-1.4
Volume:
Proceedings of the 5th Workshop on NLP for Conversational AI (NLP4ConvAI 2023)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Yun-Nung Chen, Abhinav Rastogi
Venue:
NLP4ConvAI
Publisher:
Association for Computational Linguistics
Pages:
39–46
URL:
https://aclanthology.org/2023.nlp4convai-1.4
DOI:
10.18653/v1/2023.nlp4convai-1.4
Cite (ACL):
Longfei Yang, Jiyi Li, Sheng Li, and Takahiro Shinozaki. 2023. Dialogue State Tracking with Sparse Local Slot Attention. In Proceedings of the 5th Workshop on NLP for Conversational AI (NLP4ConvAI 2023), pages 39–46, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Dialogue State Tracking with Sparse Local Slot Attention (Yang et al., NLP4ConvAI 2023)
PDF:
https://aclanthology.org/2023.nlp4convai-1.4.pdf