FLatS: Principled Out-of-Distribution Detection with Feature-Based Likelihood Ratio Score

Haowei Lin, Yuntian Gu


Abstract
Detecting out-of-distribution (OOD) instances is crucial for NLP models in practical applications. Although numerous OOD detection methods exist, most of them are empirical. Backed by theoretical analysis, this paper advocates measuring the "OOD-ness" of a test case x via the likelihood ratio between the out-distribution P_out and the in-distribution P_in. We argue that state-of-the-art (SOTA) feature-based OOD detection methods, such as Maha and KNN, are suboptimal because they estimate only the in-distribution density p_in(x). To address this issue, we propose FLatS, a principled solution for OOD detection based on the likelihood ratio. Moreover, we demonstrate that FLatS can serve as a general framework that can enhance other OOD detection methods by incorporating an estimate of the out-distribution density p_out(x). Experiments show that FLatS establishes a new SOTA on popular benchmarks.
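The likelihood-ratio idea in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation; it assumes, purely for illustration, that k-nearest-neighbor distance in feature space serves as a negative log-density proxy (as in KNN-based OOD scoring), so that log p_out(x) − log p_in(x) is approximated by the difference of kNN distances to an in-distribution feature bank and an auxiliary out-distribution feature bank. The function names and the choice of k are hypothetical.

```python
import numpy as np

def knn_distance(queries, bank, k=5):
    # Distance from each query vector to its k-th nearest neighbor in the bank.
    # queries: (n, d) array, bank: (m, d) array; returns an (n,) array.
    dists = np.linalg.norm(bank[None, :, :] - queries[:, None, :], axis=-1)
    return np.sort(dists, axis=1)[:, k - 1]

def likelihood_ratio_score(features, in_bank, out_bank, k=5):
    # Likelihood-ratio-style OOD score: larger => more likely OOD.
    # kNN distance acts as a negative log-density proxy, so
    # log p_out(x) - log p_in(x) ~ d_in(x) - d_out(x).
    return knn_distance(features, in_bank, k) - knn_distance(features, out_bank, k)

# Toy usage: in-distribution features cluster near 0, auxiliary
# out-distribution features cluster near 5.
rng = np.random.default_rng(0)
in_bank = rng.normal(0.0, 0.1, size=(50, 4))
out_bank = rng.normal(5.0, 0.1, size=(50, 4))
test = np.vstack([np.zeros((1, 4)), np.full((1, 4), 5.0)])
scores = likelihood_ratio_score(test, in_bank, out_bank)
print(scores[0] < scores[1])  # the in-distribution point scores lower
```

The contrast with density-only methods is visible here: scoring by `knn_distance(features, in_bank, k)` alone (a p_in-only estimate) ignores how close a point sits to the out-distribution, which is exactly the term the likelihood ratio adds.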
Anthology ID:
2023.emnlp-main.554
Volume:
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
8956–8963
URL:
https://aclanthology.org/2023.emnlp-main.554
DOI:
10.18653/v1/2023.emnlp-main.554
Cite (ACL):
Haowei Lin and Yuntian Gu. 2023. FLatS: Principled Out-of-Distribution Detection with Feature-Based Likelihood Ratio Score. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 8956–8963, Singapore. Association for Computational Linguistics.
Cite (Informal):
FLatS: Principled Out-of-Distribution Detection with Feature-Based Likelihood Ratio Score (Lin & Gu, EMNLP 2023)
PDF:
https://aclanthology.org/2023.emnlp-main.554.pdf
Video:
https://aclanthology.org/2023.emnlp-main.554.mp4