Contrastive Out-of-Distribution Detection for Pretrained Transformers

Wenxuan Zhou, Fangyu Liu, Muhao Chen


Abstract
Pretrained Transformers achieve remarkable performance when the training and test data are drawn from the same distribution. In real-world scenarios, however, the model often faces out-of-distribution (OOD) instances that can cause severe semantic shift problems at inference time. A reliable model should therefore identify such instances and either reject them at inference time or pass them to models that handle a different distribution. In this paper, we develop an unsupervised OOD detection method in which only in-distribution (ID) data are used in training. We propose to fine-tune the Transformers with a contrastive loss, which improves the compactness of representations so that OOD instances can be better differentiated from ID ones. These OOD instances can then be accurately detected using the Mahalanobis distance in the model's penultimate layer. We experiment with comprehensive settings and achieve near-perfect OOD detection performance, substantially outperforming the baselines. We further investigate the rationale behind the improvement, finding that it stems from the more compact representations induced by margin-based contrastive learning. We release our code to the community for future research.
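As an illustration of the two components summarized above, the following is a minimal PyTorch sketch of a margin-based contrastive fine-tuning loss and Mahalanobis-distance OOD scoring on penultimate-layer features. The function names and the exact hinge-style loss form are assumptions for illustration only, not the authors' implementation; see the released code (wzhouad/Contra-OOD) for the official version.

```python
# Illustrative sketch only; names and the exact loss form are assumptions,
# not the authors' implementation (see wzhouad/Contra-OOD).
import torch
import torch.nn.functional as F


def margin_contrastive_loss(feats, labels, margin=1.0):
    """Hinge-style pairwise contrastive loss: same-class pairs are pulled
    together, different-class pairs are pushed apart beyond `margin`."""
    dist = torch.cdist(feats, feats)                         # pairwise L2 distances
    same = labels.unsqueeze(0).eq(labels.unsqueeze(1)).float()
    pos = same * dist.pow(2)                                 # attract positives
    neg = (1.0 - same) * F.relu(margin - dist).pow(2)        # repel negatives
    mask = 1.0 - torch.eye(len(feats), device=feats.device)  # drop self-pairs
    return ((pos + neg) * mask).sum() / mask.sum()


def fit_class_gaussians(train_feats, train_labels):
    """Estimate per-class means and a shared (tied) covariance on
    in-distribution training features; labels are assumed to be 0..C-1."""
    num_classes = int(train_labels.max()) + 1
    means = torch.stack([train_feats[train_labels == c].mean(0)
                         for c in range(num_classes)])
    centered = train_feats - means[train_labels]
    cov = centered.t() @ centered / len(train_feats)
    return means, torch.linalg.pinv(cov)                     # precision matrix


def mahalanobis_ood_score(feats, means, precision):
    """OOD score = squared Mahalanobis distance to the nearest class mean
    (larger means more likely out-of-distribution)."""
    diff = feats.unsqueeze(1) - means.unsqueeze(0)            # [N, C, D]
    d2 = torch.einsum('ncd,de,nce->nc', diff, precision, diff)
    return d2.min(dim=1).values
```

In a setup like this, the contrastive term would be used during fine-tuning alongside the task objective, the Gaussian parameters would be fit once on the ID training features, and test inputs with large Mahalanobis scores would be flagged as OOD.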
Anthology ID:
2021.emnlp-main.84
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
1100–1111
URL:
https://aclanthology.org/2021.emnlp-main.84
DOI:
10.18653/v1/2021.emnlp-main.84
Cite (ACL):
Wenxuan Zhou, Fangyu Liu, and Muhao Chen. 2021. Contrastive Out-of-Distribution Detection for Pretrained Transformers. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 1100–1111, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Contrastive Out-of-Distribution Detection for Pretrained Transformers (Zhou et al., EMNLP 2021)
PDF:
https://aclanthology.org/2021.emnlp-main.84.pdf
Video:
https://aclanthology.org/2021.emnlp-main.84.mp4
Code
wzhouad/Contra-OOD
Data
IMDb Movie Reviews, MultiNLI, SST-2, WMT 2016