Is Fine-tuning Needed? Pre-trained Language Models Are Near Perfect for Out-of-Domain Detection

Rheeya Uppaal, Junjie Hu, Yixuan Li


Abstract
Out-of-distribution (OOD) detection is a critical task for reliable predictions over text. Fine-tuning with pre-trained language models has been a de facto procedure to derive OOD detectors with respect to in-distribution (ID) data. Despite its common use, the understanding of the role of fine-tuning and its necessity for OOD detection is largely unexplored. In this paper, we raise the question: is fine-tuning necessary for OOD detection? We present a study investigating the efficacy of directly leveraging pre-trained language models for OOD detection, without any model fine-tuning on the ID data. We compare the approach with several competitive fine-tuning objectives, and offer new insights under various types of distributional shifts. Extensive experiments demonstrate near-perfect OOD detection performance (with 0% FPR95 in many cases), strongly outperforming the fine-tuned counterpart.
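To make the idea in the abstract concrete, below is a minimal sketch of zero-fine-tuning OOD detection with a frozen pre-trained language model: embed texts with the pre-trained encoder and score each input by its distance to the in-distribution (ID) training data. The model name (roberta-base), mean pooling, and the Mahalanobis-style score are illustrative assumptions for this sketch, not necessarily the paper's exact configuration.

import torch
import numpy as np
from transformers import AutoTokenizer, AutoModel

# Frozen pre-trained encoder; no fine-tuning on the ID data.
tokenizer = AutoTokenizer.from_pretrained("roberta-base")
model = AutoModel.from_pretrained("roberta-base").eval()

@torch.no_grad()
def embed(texts):
    """Mean-pooled last-layer embeddings from the frozen encoder."""
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    hidden = model(**batch).last_hidden_state             # (B, T, H)
    mask = batch["attention_mask"].unsqueeze(-1)           # (B, T, 1)
    return ((hidden * mask).sum(1) / mask.sum(1)).numpy()  # (B, H)

def fit_id_stats(id_texts):
    """Estimate ID mean and (regularized) inverse covariance from ID texts."""
    feats = embed(id_texts)
    mean = feats.mean(axis=0)
    cov = np.cov(feats, rowvar=False) + 1e-6 * np.eye(feats.shape[1])
    return mean, np.linalg.inv(cov)

def ood_score(texts, mean, prec):
    """Mahalanobis distance to the ID data; higher means more likely OOD."""
    diff = embed(texts) - mean
    return np.einsum("bi,ij,bj->b", diff, prec, diff)

# Usage: fit on ID data, then threshold scores (e.g., at the 95th ID-score
# percentile, as in FPR95-style evaluation) to flag OOD inputs.
id_mean, id_prec = fit_id_stats(["an in-domain example", "another ID sentence"])
scores = ood_score(["a possibly out-of-domain input"], id_mean, id_prec)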
Anthology ID:
2023.acl-long.717
Volume:
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
12813–12832
URL:
https://aclanthology.org/2023.acl-long.717
DOI:
10.18653/v1/2023.acl-long.717
Cite (ACL):
Rheeya Uppaal, Junjie Hu, and Yixuan Li. 2023. Is Fine-tuning Needed? Pre-trained Language Models Are Near Perfect for Out-of-Domain Detection. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 12813–12832, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Is Fine-tuning Needed? Pre-trained Language Models Are Near Perfect for Out-of-Domain Detection (Uppaal et al., ACL 2023)
PDF:
https://aclanthology.org/2023.acl-long.717.pdf
Video:
https://aclanthology.org/2023.acl-long.717.mp4