Multi-Domain Dialogue State Tracking with Top-K Slot Self Attention

Longfei Yang, Jiyi Li, Sheng Li, Takahiro Shinozaki


Abstract
As an important component of task-oriented dialogue systems, dialogue state tracking is designed to track the dialogue state through the conversations between users and systems. Multi-domain dialogue state tracking is a challenging task in which the correlations among different domains and slots need to be considered. Recently, slot self-attention has been proposed as a data-driven way to handle this. However, full-support slot self-attention may involve redundant information exchange. In this paper, we propose a top-k attention-based slot self-attention for multi-domain dialogue state tracking. In the slot self-attention layers, we force each slot to incorporate information from only the other k most prominent slots and mask the rest out. Experimental results on two mainstream multi-domain task-oriented dialogue datasets, MultiWOZ 2.0 and MultiWOZ 2.4, show that our proposed approach effectively improves the performance of multi-domain dialogue state tracking. We also find that the best result is obtained when each slot exchanges information with only a few slots.
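The core idea in the abstract, restricting each slot's self-attention to its k highest-scoring peers and masking out the rest, can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the function name, shapes, and the tie-handling behavior are our own assumptions.

```python
import numpy as np

def topk_slot_self_attention(Q, K, V, k):
    """Sketch of top-k masked self-attention over slot representations.

    Q, K, V: (num_slots, dim) arrays of slot queries, keys, and values.
    Each slot attends only to its k highest-scoring slots; all other
    attention scores are masked to -inf before the softmax.
    """
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)  # (num_slots, num_slots) attention scores

    # Per row, find the k-th largest score and mask everything below it.
    kth = np.partition(scores, -k, axis=-1)[:, -k:].min(axis=-1, keepdims=True)
    masked = np.where(scores >= kth, scores, -np.inf)

    # Numerically stable softmax over the surviving (top-k) scores.
    weights = np.exp(masked - masked.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V
```

With k equal to the number of slots this reduces to ordinary full-support slot self-attention; smaller k limits each slot to a few prominent peers, which is the regime the paper reports as best.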
Anthology ID:
2022.sigdial-1.24
Volume:
Proceedings of the 23rd Annual Meeting of the Special Interest Group on Discourse and Dialogue
Month:
September
Year:
2022
Address:
Edinburgh, UK
Editors:
Oliver Lemon, Dilek Hakkani-Tur, Junyi Jessy Li, Arash Ashrafzadeh, Daniel Hernández Garcia, Malihe Alikhani, David Vandyke, Ondřej Dušek
Venue:
SIGDIAL
SIG:
SIGDIAL
Publisher:
Association for Computational Linguistics
Pages:
231–236
URL:
https://aclanthology.org/2022.sigdial-1.24
DOI:
10.18653/v1/2022.sigdial-1.24
Cite (ACL):
Longfei Yang, Jiyi Li, Sheng Li, and Takahiro Shinozaki. 2022. Multi-Domain Dialogue State Tracking with Top-K Slot Self Attention. In Proceedings of the 23rd Annual Meeting of the Special Interest Group on Discourse and Dialogue, pages 231–236, Edinburgh, UK. Association for Computational Linguistics.
Cite (Informal):
Multi-Domain Dialogue State Tracking with Top-K Slot Self Attention (Yang et al., SIGDIAL 2022)
PDF:
https://aclanthology.org/2022.sigdial-1.24.pdf
Video:
https://youtu.be/HOaC1W3yAow