Distribution Calibration for Out-of-Domain Detection with Bayesian Approximation

Yanan Wu, Zhiyuan Zeng, Keqing He, Yutao Mou, Pei Wang, Weiran Xu


Abstract
Out-of-Domain (OOD) detection is a key component of task-oriented dialog systems: it aims to identify whether a query falls outside the predefined set of supported intents. Previous softmax-based detection algorithms have been shown to be overconfident on OOD samples. In this paper, we analyze how OOD overconfidence arises from distribution uncertainty caused by the mismatch between the training and test distributions, which prevents the model from making confident predictions and thus produces abnormal softmax scores. We propose a Bayesian OOD detection framework that calibrates distribution uncertainty using Monte-Carlo Dropout. Our method is flexible, plugs easily into existing softmax-based baselines, and gains 33.33% OOD F1 improvement while adding only 0.41% inference time compared to MSP. Further analyses show the effectiveness of Bayesian learning for OOD detection.
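The core idea — keeping dropout active at inference, averaging the softmax over several stochastic forward passes, and using the resulting calibrated confidence as an OOD score — can be sketched in a few lines. This is a minimal NumPy illustration, not the paper's implementation: the two-class linear model, its weights, and the dropout-on-input simplification are all hypothetical assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def mc_dropout_confidence(x, W, b, T=50, p=0.5):
    """Average the softmax over T stochastic forward passes with
    dropout kept ON at inference; the max of the averaged softmax
    serves as a calibrated confidence score (low => likely OOD)."""
    probs = []
    for _ in range(T):
        mask = rng.random(x.shape) > p      # Bernoulli dropout mask
        h = (x * mask) / (1.0 - p)          # inverted-dropout rescaling
        probs.append(softmax(h @ W + b))
    mean_p = np.mean(probs, axis=0)         # Monte-Carlo estimate of p(y|x)
    return mean_p.max()

# Toy 2-class linear "intent classifier" (hypothetical weights)
W = np.array([[2.0, -2.0],
              [-2.0, 2.0]])
b = np.zeros(2)

conf_in = mc_dropout_confidence(np.array([3.0, -3.0]), W, b)   # in-domain query
conf_ood = mc_dropout_confidence(np.array([0.1, 0.1]), W, b)   # ambiguous / OOD query
# conf_in stays high while conf_ood collapses toward uniform,
# so a simple threshold on the averaged confidence flags OOD inputs.
```

In a real model the dropout masks would be applied inside the network (e.g. between hidden layers) exactly as during training; the averaging step is what distinguishes this from a single overconfident softmax pass.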
Anthology ID:
2022.coling-1.50
Volume:
Proceedings of the 29th International Conference on Computational Linguistics
Month:
October
Year:
2022
Address:
Gyeongju, Republic of Korea
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
608–615
URL:
https://aclanthology.org/2022.coling-1.50
Cite (ACL):
Yanan Wu, Zhiyuan Zeng, Keqing He, Yutao Mou, Pei Wang, and Weiran Xu. 2022. Distribution Calibration for Out-of-Domain Detection with Bayesian Approximation. In Proceedings of the 29th International Conference on Computational Linguistics, pages 608–615, Gyeongju, Republic of Korea. International Committee on Computational Linguistics.
Cite (Informal):
Distribution Calibration for Out-of-Domain Detection with Bayesian Approximation (Wu et al., COLING 2022)
PDF:
https://aclanthology.org/2022.coling-1.50.pdf
Code:
pris-nlp/coling2022_bayesian-for-ood