Open World Classification with Adaptive Negative Samples

Ke Bai, Guoyin Wang, Jiwei Li, Sunghyun Park, Sungjin Lee, Puyang Xu, Ricardo Henao, Lawrence Carin


Abstract
Open world classification is a task in natural language processing of key practical relevance and impact. Since data from the open or unknown category only appear at inference time, finding a model with a suitable decision boundary that accommodates both the identification of known classes and the discrimination of the open category is challenging. The performance of existing models is limited by the lack of effective open category data during training or the lack of a good mechanism for learning appropriate decision boundaries. We propose an approach based on Adaptive Negative Samples (ANS), designed to generate effective synthetic open category samples during training without requiring any prior knowledge or external datasets. Empirically, we find a significant advantage in using auxiliary one-versus-rest binary classifiers, which effectively utilize the generated negative samples and avoid the complex threshold-seeking stage of previous works. Extensive experiments on three benchmark datasets show that ANS achieves significant improvements over state-of-the-art methods.
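The recipe outlined in the abstract can be illustrated with a minimal sketch: an encoder feeding K one-versus-rest binary heads, trained so that each known example is a positive for its own head and a negative for the rest, while synthetic open-category samples are negatives for every head; at inference, an input whose sigmoid scores all fall below 0.5 is assigned to the open category, removing the need for a separate threshold-search step. The negative generator below (random mixing of inputs from different known classes), the toy encoder, and all names and dimensions are illustrative assumptions, not the paper's ANS procedure.

    # Hypothetical sketch: one-versus-rest heads with synthetic negatives.
    # The synth_negatives() generator is a crude placeholder, NOT the
    # adaptive negative sampling method proposed in the paper.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class OVRClassifier(nn.Module):
        """Encoder followed by K independent binary (one-vs-rest) heads."""
        def __init__(self, input_dim: int, hidden_dim: int, num_known: int):
            super().__init__()
            self.encoder = nn.Sequential(nn.Linear(input_dim, hidden_dim), nn.ReLU())
            self.heads = nn.Linear(hidden_dim, num_known)  # one logit per known class

        def forward(self, x):
            return self.heads(self.encoder(x))  # raw logits, shape (B, K)

    def ovr_loss(logits, labels, neg_logits):
        # Known samples: positive for their own class, negative for all others.
        targets = F.one_hot(labels, logits.size(1)).float()
        loss_known = F.binary_cross_entropy_with_logits(logits, targets)
        # Synthetic open-category samples: negative for every head.
        loss_neg = F.binary_cross_entropy_with_logits(
            neg_logits, torch.zeros_like(neg_logits))
        return loss_known + loss_neg

    def synth_negatives(features, labels):
        # Placeholder generator: mix pairs of inputs drawn from different classes.
        perm = torch.randperm(features.size(0))
        lam = torch.rand(features.size(0), 1)
        mixed = lam * features + (1 - lam) * features[perm]
        return mixed[labels != labels[perm]]

    @torch.no_grad()
    def predict(model, x, threshold=0.5):
        probs = torch.sigmoid(model(x))
        conf, cls = probs.max(dim=1)
        cls[conf < threshold] = -1  # -1 denotes the open / unknown category
        return cls

    # Toy usage with random data (10 known classes, 32-dim inputs).
    model = OVRClassifier(32, 64, 10)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    x, y = torch.randn(128, 32), torch.randint(0, 10, (128,))
    neg = synth_negatives(x, y)
    loss = ovr_loss(model(x), y, model(neg))
    loss.backward(); opt.step()
    print(predict(model, torch.randn(4, 32)))

Here the fixed 0.5 cut-off on the sigmoid outputs stands in for the abstract's claim that one-versus-rest heads avoid a separate threshold-seeking stage; how the paper actually generates and adapts the negative samples is not reproduced in this sketch.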
Anthology ID: 2022.emnlp-main.295
Volume: Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing
Month: December
Year: 2022
Address: Abu Dhabi, United Arab Emirates
Editors: Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue: EMNLP
Publisher: Association for Computational Linguistics
Pages: 4378–4392
URL: https://aclanthology.org/2022.emnlp-main.295
DOI: 10.18653/v1/2022.emnlp-main.295
Cite (ACL): Ke Bai, Guoyin Wang, Jiwei Li, Sunghyun Park, Sungjin Lee, Puyang Xu, Ricardo Henao, and Lawrence Carin. 2022. Open World Classification with Adaptive Negative Samples. In Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, pages 4378–4392, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal): Open World Classification with Adaptive Negative Samples (Bai et al., EMNLP 2022)
PDF: https://aclanthology.org/2022.emnlp-main.295.pdf