Two Birds One Stone: Dynamic Ensemble for OOD Intent Classification

Yunhua Zhou, Jianqiang Yang, Pengyu Wang, Xipeng Qiu


Abstract
Out-of-domain (OOD) intent classification is an active field of natural language understanding and is of great practical significance for intelligent systems such as task-oriented dialogue systems. It poses two intertwined challenges: the model must know what it knows and recognize what it does not know. This paper investigates “overthinking” in the open-world scenario and its impact on OOD intent classification. Motivated by this observation, we propose a two-birds-one-stone method that allows the model to decide during inference whether to commit to an OOD classification early, preserving accuracy while accelerating inference. To support this dynamic inference behavior, we further propose an ensemble-based training method. Beyond theoretical insights, we conduct detailed experiments on three real-world intent datasets. Compared with previous baselines, our method not only improves inference speed but also achieves significant performance improvements.
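The core idea described in the abstract, letting the model exit early once an intermediate layer is already confident about its in-domain vs. OOD decision, can be illustrated with a minimal sketch. The exit rule, threshold names (`TAU_ID`, `TAU_OOD`), and per-layer classifier heads below are illustrative assumptions for exposition, not the paper's actual implementation.

```python
# Illustrative sketch of early-exit OOD intent classification.
# All thresholds, head shapes, and the exit rule are assumptions,
# not the method proposed in the paper.
import numpy as np

rng = np.random.default_rng(0)

NUM_LAYERS, HIDDEN, NUM_INTENTS = 4, 16, 5
TAU_ID = 0.9    # hypothetical: exit early with an in-domain label
TAU_OOD = 0.3   # hypothetical: exit early with an OOD decision

# Stand-ins for per-layer transformations and classifier heads.
layers = [rng.normal(size=(HIDDEN, HIDDEN)) for _ in range(NUM_LAYERS)]
heads = [rng.normal(size=(HIDDEN, NUM_INTENTS)) for _ in range(NUM_LAYERS)]

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def classify(x):
    """Run layer by layer; stop as soon as a head is confident enough."""
    h = x
    for depth, (W, head) in enumerate(zip(layers, heads)):
        h = np.tanh(h @ W)
        probs = softmax(h @ head)
        conf = probs.max()
        if conf >= TAU_ID:    # confident in-domain prediction: exit now
            return int(probs.argmax()), depth
        if conf <= TAU_OOD:   # confidently none-of-the-above: flag as OOD
            return "OOD", depth
    # Fell through every early exit: decide with the final head.
    return (int(probs.argmax()) if probs.max() > TAU_OOD else "OOD"), NUM_LAYERS - 1

label, exit_depth = classify(rng.normal(size=HIDDEN))
print(f"prediction={label}, exited after layer {exit_depth}")
```

In this sketch, low maximum confidence at an intermediate layer is treated as an OOD signal, which echoes the paper's intuition that deeper layers can "overthink" and that stopping early can both speed up inference and preserve a decision that was already clear.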
Anthology ID:
2023.acl-long.595
Volume:
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
10659–10673
URL:
https://aclanthology.org/2023.acl-long.595
DOI:
10.18653/v1/2023.acl-long.595
Cite (ACL):
Yunhua Zhou, Jianqiang Yang, Pengyu Wang, and Xipeng Qiu. 2023. Two Birds One Stone: Dynamic Ensemble for OOD Intent Classification. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 10659–10673, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Two Birds One Stone: Dynamic Ensemble for OOD Intent Classification (Zhou et al., ACL 2023)
PDF:
https://aclanthology.org/2023.acl-long.595.pdf
Video:
https://aclanthology.org/2023.acl-long.595.mp4