Pseudo Outlier Exposure for Out-of-Distribution Detection using Pretrained Transformers

Jaeyoung Kim, Kyuheon Jung, Dongbin Na, Sion Jang, Eunbin Park, Sungchul Choi


Abstract
For real-world language applications, detecting an out-of-distribution (OOD) sample is helpful to alert users or reject such unreliable samples. However, modern over-parameterized language models often produce overconfident predictions for both in-distribution (ID) and OOD samples. In particular, language models suffer from OOD samples whose semantic representations are similar to those of ID samples, since such OOD samples lie near the ID manifold. A rejection network can be trained with ID and diverse outlier samples to detect test OOD samples, but explicitly collecting auxiliary OOD datasets imposes an additional data-collection burden. In this paper, we propose a simple but effective method called Pseudo Outlier Exposure (POE) that constructs a surrogate OOD dataset by sequentially masking tokens related to ID classes. The surrogate OOD samples introduced by POE have representations similar to ID data, which is most effective for training a rejection network. Our method requires no external OOD data and can be easily implemented within off-the-shelf Transformers. A comprehensive comparison with state-of-the-art algorithms demonstrates POE's competitiveness on several text classification benchmarks.
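To make the masking idea concrete, below is a minimal, hypothetical Python sketch of the core procedure, not the authors' released implementation. It greedily replaces the tokens most indicative of the ID class with [MASK] until the classifier's confidence falls below a threshold, yielding a near-manifold pseudo-outlier. The importance score (per-token confidence drop), the threshold `tau`, the `max_masks` budget, and the checkpoint name are all illustrative assumptions; the paper's exact token-scoring rule may differ.

```python
# Hypothetical sketch of POE-style pseudo-outlier construction (illustrative only).
# Assumptions not taken from the paper: importance = confidence drop when a token
# is masked; stopping rule = max softmax below `tau`; model = any fine-tuned
# ID classifier loadable via Hugging Face Transformers.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

name = "bert-base-uncased"  # placeholder; substitute your fine-tuned ID classifier
tok = AutoTokenizer.from_pretrained(name)
model = AutoModelForSequenceClassification.from_pretrained(name, num_labels=4)
model.eval()

@torch.no_grad()
def confidence(ids: torch.Tensor) -> float:
    """Maximum softmax probability of the classifier for one tokenized input."""
    logits = model(input_ids=ids).logits
    return torch.softmax(logits, dim=-1).max().item()

@torch.no_grad()
def make_pseudo_outlier(text: str, tau: float = 0.5, max_masks: int = 10) -> str:
    ids = tok(text, return_tensors="pt")["input_ids"]
    for _ in range(max_masks):
        if confidence(ids) < tau:  # confidence already low: near-OOD achieved
            break
        # Greedily pick the position whose masking reduces ID confidence the most.
        best_pos, best_conf = None, float("inf")
        for pos in range(1, ids.size(1) - 1):  # skip [CLS] and [SEP]
            if ids[0, pos].item() == tok.mask_token_id:
                continue  # already masked
            trial = ids.clone()
            trial[0, pos] = tok.mask_token_id
            if confidence(trial) < best_conf:
                best_pos, best_conf = pos, confidence(trial)
        if best_pos is None:  # nothing left to mask
            break
        ids[0, best_pos] = tok.mask_token_id  # erase the class-indicative token
    return tok.decode(ids[0])

# Usage: the returned string keeps an ID-like surface form while its
# class-indicative tokens are masked, serving as a surrogate OOD sample
# for training a rejection network.
print(make_pseudo_outlier("the striker scored twice in the championship final"))
```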
Anthology ID:
2023.findings-acl.95
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
1469–1482
URL:
https://aclanthology.org/2023.findings-acl.95
DOI:
10.18653/v1/2023.findings-acl.95
Cite (ACL):
Jaeyoung Kim, Kyuheon Jung, Dongbin Na, Sion Jang, Eunbin Park, and Sungchul Choi. 2023. Pseudo Outlier Exposure for Out-of-Distribution Detection using Pretrained Transformers. In Findings of the Association for Computational Linguistics: ACL 2023, pages 1469–1482, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Pseudo Outlier Exposure for Out-of-Distribution Detection using Pretrained Transformers (Kim et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-acl.95.pdf
Video:
https://aclanthology.org/2023.findings-acl.95.mp4