UNO-DST: Leveraging Unlabelled Data in Zero-Shot Dialogue State Tracking

Chuang Li, Yan Zhang, Min-Yen Kan, Haizhou Li


Abstract
Previous zero-shot dialogue state tracking (DST) methods rely only on transfer learning and ignore unlabelled data in the target domain. We transform zero-shot DST into few-shot DST by utilising such unlabelled data via joint and self-training. Our method introduces an auxiliary task that generates slot types as an inverse prompt to the main task, which generates slot values during joint training. Cycle consistency between these two tasks enables the generation and selection of high-quality samples in unknown target domains for subsequent fine-tuning. This approach also facilitates automatic label creation, thereby optimising the training and fine-tuning of DST models. We demonstrate this method's effectiveness on general language models in zero-shot scenarios, improving average joint goal accuracy by 8% across all domains in MultiWOZ.
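As an illustration of the cycle-consistency selection the abstract describes, here is a minimal Python sketch. The callables `generate_value` and `generate_slot_type` are hypothetical stand-ins for the paper's main and auxiliary generation tasks, not the authors' actual implementation; only samples whose slot type survives the round trip are kept for fine-tuning.

```python
# Minimal sketch of cycle-consistency sample selection (illustrative only).
# `generate_value` and `generate_slot_type` are assumed model wrappers:
#   main task:      (dialogue, slot_type)  -> slot_value (or None if absent)
#   auxiliary task: (dialogue, slot_value) -> slot_type  (inverse prompt)

def cycle_consistent_samples(dialogues, slot_types,
                             generate_value, generate_slot_type):
    """Keep pseudo-labelled samples whose slot type survives a round trip."""
    selected = []
    for dialogue in dialogues:
        for slot_type in slot_types:
            value = generate_value(dialogue, slot_type)        # main task
            if value is None:                                  # slot not mentioned
                continue
            recovered = generate_slot_type(dialogue, value)    # auxiliary task
            if recovered == slot_type:                         # cycle consistency
                selected.append((dialogue, slot_type, value))  # pseudo-label
    return selected
```

Under this reading, the selected triples serve as automatically created labels for fine-tuning the DST model in the unlabelled target domain.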
Anthology ID:
2024.findings-naacl.187
Volume:
Findings of the Association for Computational Linguistics: NAACL 2024
Month:
June
Year:
2024
Address:
Mexico City, Mexico
Editors:
Kevin Duh, Helena Gomez, Steven Bethard
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
2972–2983
URL:
https://aclanthology.org/2024.findings-naacl.187
Cite (ACL):
Chuang Li, Yan Zhang, Min-Yen Kan, and Haizhou Li. 2024. UNO-DST: Leveraging Unlabelled Data in Zero-Shot Dialogue State Tracking. In Findings of the Association for Computational Linguistics: NAACL 2024, pages 2972–2983, Mexico City, Mexico. Association for Computational Linguistics.
Cite (Informal):
UNO-DST: Leveraging Unlabelled Data in Zero-Shot Dialogue State Tracking (Li et al., Findings 2024)
PDF:
https://aclanthology.org/2024.findings-naacl.187.pdf
Copyright:
2024.findings-naacl.187.copyright.pdf