On Prefix-tuning for Lightweight Out-of-distribution Detection

Yawen Ouyang, Yongchang Cao, Yuan Gao, Zhen Wu, Jianbing Zhang, Xinyu Dai


Abstract
Out-of-distribution (OOD) detection, a fundamental task vexing real-world applications, has attracted growing attention in the NLP community. Recently, fine-tuning-based methods have made promising progress. However, storing a fine-tuned model for each scenario can be costly. In this paper, we depart from classic fine-tuning-based OOD detection toward a parameter-efficient alternative, and propose an unsupervised prefix-tuning-based OOD detection framework, termed PTO. Additionally, to take advantage of optional training-data labels and targeted OOD data, two practical extensions of PTO are further proposed. Overall, PTO and its extensions offer the key advantages of being lightweight, easy to reproduce, and theoretically justified. Experimental results show that our methods perform comparably to, or even better than, existing fine-tuning-based OOD detection approaches under a wide range of metrics, detection settings, and OOD types.
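
The abstract does not spell out PTO's scoring mechanism. As a rough illustration only, the sketch below shows one generic way prefix-tuning could be used for unsupervised OOD detection: learn a small prefix on in-distribution text with a frozen causal language model, then treat a high negative log-likelihood under the prefixed model as an OOD signal. The GPT-2 backbone, the peft-based prefix tuning, the placeholder `id_texts`, and the NLL score are all assumptions for illustration, not the PTO framework as defined in the paper.

```python
# Hypothetical sketch: prefix-tuning a frozen causal LM on in-distribution (ID)
# text, then using negative log-likelihood as an OOD score. Not the paper's
# exact objective; details (backbone, score, hyperparameters) are assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PrefixTuningConfig, get_peft_model, TaskType

tokenizer = AutoTokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token          # GPT-2 has no pad token by default
base = AutoModelForCausalLM.from_pretrained("gpt2")

# Prefix-tuning: only the virtual-token prefix is trained; the LM stays frozen.
peft_config = PrefixTuningConfig(task_type=TaskType.CAUSAL_LM, num_virtual_tokens=20)
model = get_peft_model(base, peft_config)

def lm_loss(texts):
    """Average token-level negative log-likelihood of `texts` under the prefixed LM."""
    batch = tokenizer(texts, return_tensors="pt", padding=True, truncation=True)
    labels = batch["input_ids"].masked_fill(batch["attention_mask"] == 0, -100)
    return model(**batch, labels=labels).loss

# Placeholder in-distribution sentences (hypothetical training data).
id_texts = ["book a flight to boston", "play some jazz music", "set an alarm for 7 am"]

# Minimal training loop: fit the prefix to the ID distribution.
optimizer = torch.optim.AdamW(
    (p for p in model.parameters() if p.requires_grad), lr=5e-3
)
for epoch in range(3):
    loss = lm_loss(id_texts)
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()

@torch.no_grad()
def ood_score(text):
    # Higher NLL (lower likelihood) under the ID-tuned prefix => more likely OOD.
    model.eval()
    return lm_loss([text]).item()

print(ood_score("book a flight to denver"))   # expected: relatively low score
print(ood_score("quantum chromodynamics lagrangian"))  # expected: higher score
```

Since only the prefix parameters are stored per scenario, this kind of setup keeps the deployment footprint small relative to storing a fully fine-tuned model per scenario, which is the parameter-efficiency argument the abstract makes.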
Anthology ID: 2023.acl-long.85
Volume: Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month: July
Year: 2023
Address: Toronto, Canada
Editors: Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue: ACL
Publisher: Association for Computational Linguistics
Pages: 1533–1545
URL: https://aclanthology.org/2023.acl-long.85
DOI: 10.18653/v1/2023.acl-long.85
Cite (ACL): Yawen Ouyang, Yongchang Cao, Yuan Gao, Zhen Wu, Jianbing Zhang, and Xinyu Dai. 2023. On Prefix-tuning for Lightweight Out-of-distribution Detection. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 1533–1545, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal): On Prefix-tuning for Lightweight Out-of-distribution Detection (Ouyang et al., ACL 2023)
PDF: https://aclanthology.org/2023.acl-long.85.pdf
Video: https://aclanthology.org/2023.acl-long.85.mp4