Value type: the bridge to a better DST model

Gao Qixiang, Mingyang Sun, Yutao Mou, Chen Zeng, Weiran Xu


Abstract
The value types of slots can provide useful information for DST tasks, yet they have been ignored in most previous work. In this paper, we propose a new framework for DST based on these value types. First, we extract the value type of tokens in each turn. Specifically, we divide the slots in the dataset into nine categories according to the type of their values, and then train an NER model to extract the corresponding typed entities from each turn of the conversation. Second, we improve the attention mechanism between each slot and the dialogue history by integrating value type information, helping each slot attend more to the turns that contain the same value type. Meanwhile, we introduce a sampling strategy to integrate these types into the attention formula, which reduces the impact of NER errors. Finally, we conduct comprehensive experiments on two multi-domain task-oriented dialogue datasets, MultiWOZ 2.1 and MultiWOZ 2.4. The ablation results show that our method is effective on both datasets, verifying the necessity of considering slot value types.
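The abstract does not give the exact attention formula, but the idea of biasing a slot's attention toward turns whose extracted entities share the slot's value type can be sketched as follows. This is a minimal toy illustration, not the paper's implementation: the function name, the additive `type_bonus` term, and the set-based type representation are all assumptions.

```python
import math

def type_aware_attention(slot_vec, turn_vecs, slot_type, turn_types, type_bonus=1.0):
    """Toy scaled dot-product attention of one slot over dialogue turns.

    A bonus (hypothetical additive bias) is added to the logit of any turn
    whose NER-extracted entity types include the slot's value type, so the
    slot attends more to turns likely to contain its value.
    """
    d = len(slot_vec)
    logits = []
    for vec, types in zip(turn_vecs, turn_types):
        score = sum(q * k for q, k in zip(slot_vec, vec)) / math.sqrt(d)
        if slot_type in types:      # turn mentions the same value type
            score += type_bonus     # bias attention toward this turn
        logits.append(score)
    # softmax over turns
    m = max(logits)
    exps = [math.exp(s - m) for s in logits]
    z = sum(exps)
    return [e / z for e in exps]
```

With identical turn representations, the turn tagged with the slot's value type receives the larger weight, which is the intended effect of the type bias.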
Anthology ID:
2023.findings-acl.78
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
1211–1219
URL:
https://aclanthology.org/2023.findings-acl.78
DOI:
10.18653/v1/2023.findings-acl.78
Cite (ACL):
Gao Qixiang, Mingyang Sun, Yutao Mou, Chen Zeng, and Weiran Xu. 2023. Value type: the bridge to a better DST model. In Findings of the Association for Computational Linguistics: ACL 2023, pages 1211–1219, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Value type: the bridge to a better DST model (Qixiang et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-acl.78.pdf