Turn-Level Active Learning for Dialogue State Tracking

Zihan Zhang, Meng Fang, Fanghua Ye, Ling Chen, Mohammad-Reza Namazi-Rad


Abstract
Dialogue state tracking (DST) plays an important role in task-oriented dialogue systems. However, collecting turn-by-turn annotated dialogue data at scale is costly and inefficient. In this paper, we propose a novel turn-level active learning framework for DST that actively selects which dialogue turns to annotate. Under a limited labelling budget, experimental results demonstrate the effectiveness of selectively annotating dialogue turns. Moreover, our approach achieves DST performance comparable to traditional training approaches with significantly less annotated data, providing a more efficient way to annotate new dialogue data.
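The abstract's core idea, selecting the most informative turns to annotate under a fixed budget, can be sketched with a standard uncertainty-sampling heuristic. This is a minimal illustration, not the paper's actual selection strategy: the turn identifiers, the `slot_probs` field, and the use of average per-slot predictive entropy as the acquisition score are all assumptions made for the example.

```python
import math

def turn_entropy(slot_distributions):
    """Average predictive entropy across a turn's slot-value distributions.

    Higher entropy = the DST model is less certain about this turn,
    so (under uncertainty sampling) it is a better candidate to annotate.
    """
    total = 0.0
    for probs in slot_distributions:
        total += -sum(p * math.log(p) for p in probs if p > 0)
    return total / len(slot_distributions)

def select_turns(candidate_turns, budget):
    """Rank unlabelled turns by entropy and return the top `budget` turn ids."""
    ranked = sorted(candidate_turns,
                    key=lambda t: turn_entropy(t["slot_probs"]),
                    reverse=True)
    return [t["turn_id"] for t in ranked[:budget]]

# Toy pool: per-slot model confidences for three unlabelled turns
# (hypothetical data, not from the paper).
turns = [
    {"turn_id": "dlg1-t0", "slot_probs": [[0.9, 0.1], [0.8, 0.2]]},    # fairly confident
    {"turn_id": "dlg1-t1", "slot_probs": [[0.5, 0.5], [0.6, 0.4]]},    # uncertain
    {"turn_id": "dlg2-t0", "slot_probs": [[0.7, 0.3], [0.95, 0.05]]},  # mixed
]
print(select_turns(turns, budget=2))  # → ['dlg1-t1', 'dlg1-t0']
```

With a budget of 2, the most uncertain turns are sent for annotation while confident ones are skipped, which is the mechanism that lets turn-level selection match full annotation at a fraction of the labelling cost.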
Anthology ID:
2023.emnlp-main.478
Volume:
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
7705–7719
URL:
https://aclanthology.org/2023.emnlp-main.478
DOI:
10.18653/v1/2023.emnlp-main.478
Cite (ACL):
Zihan Zhang, Meng Fang, Fanghua Ye, Ling Chen, and Mohammad-Reza Namazi-Rad. 2023. Turn-Level Active Learning for Dialogue State Tracking. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 7705–7719, Singapore. Association for Computational Linguistics.
Cite (Informal):
Turn-Level Active Learning for Dialogue State Tracking (Zhang et al., EMNLP 2023)
PDF:
https://aclanthology.org/2023.emnlp-main.478.pdf
Video:
https://aclanthology.org/2023.emnlp-main.478.mp4