Early Stopping Based on Unlabeled Samples in Text Classification

HongSeok Choi, Dongha Choi, Hyunju Lee


Abstract
Early stopping, which is widely used to prevent overfitting, is generally based on a separate validation set. However, in low-resource settings, validation-based stopping can be risky because a small validation set may not be sufficiently representative, and splitting off a validation set leaves fewer samples for training. In this study, we propose an early stopping method that uses unlabeled samples. The proposed method is based on confidence and class-distribution similarities. To further improve performance, we present a calibration method that better estimates the class distribution of the unlabeled samples. The proposed method is advantageous because it does not require a separate validation set and provides a better stopping point by using a large unlabeled set. We conduct extensive experiments on five text classification datasets and compare several stopping methods. Our results show that the proposed method outperforms not only the existing stopping methods but also stopping with an additional validation set, in both balanced and imbalanced data settings. Our code is available at https://github.com/DMCB-GIST/BUS-stop.
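As intuition for the criterion described in the abstract, the sketch below scores a training checkpoint from its predictions on an unlabeled set by combining a confidence term with a class-distribution similarity term. This is a minimal illustration, not the authors' implementation (see the linked repository): the total-variation similarity, the multiplicative equal weighting of the two terms, and the use of the training-set label distribution as the prior are all assumptions made for the example.

```python
import numpy as np

def checkpoint_score(probs: np.ndarray, prior: np.ndarray) -> float:
    """Score one checkpoint from its softmax outputs on UNLABELED data.

    probs: (N, C) predicted class probabilities on the unlabeled set.
    prior: (C,) estimated class distribution (a hypothetical stand-in
           for the paper's calibrated estimate, e.g. the label
           distribution of the small labeled training set).
    """
    # Confidence term: average maximum predicted probability.
    confidence = probs.max(axis=1).mean()

    # Class-distribution similarity term: compare the distribution of
    # predicted labels with the prior. Total-variation similarity is
    # an illustrative choice, not necessarily the paper's measure.
    pred_dist = np.bincount(probs.argmax(axis=1), minlength=len(prior)) / len(probs)
    similarity = 1.0 - 0.5 * np.abs(pred_dist - prior).sum()

    # Multiplicative combination of the two terms is an assumption.
    return float(confidence * similarity)

# Usage: after each epoch, score the checkpoint on the unlabeled set and
# keep the epoch with the highest score instead of using validation loss.
# best_epoch = max(scores, key=scores.get)  # scores: {epoch: score}
```

The key point, per the abstract, is that both signals come only from unlabeled data, so no labeled samples need to be withheld from training for validation.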
Anthology ID:
2022.acl-long.52
Original:
2022.acl-long.52v1
Version 2:
2022.acl-long.52v2
Version 3:
2022.acl-long.52v3
Volume:
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
708–718
URL:
https://aclanthology.org/2022.acl-long.52
DOI:
10.18653/v1/2022.acl-long.52
Cite (ACL):
HongSeok Choi, Dongha Choi, and Hyunju Lee. 2022. Early Stopping Based on Unlabeled Samples in Text Classification. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 708–718, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Early Stopping Based on Unlabeled Samples in Text Classification (Choi et al., ACL 2022)
PDF:
https://aclanthology.org/2022.acl-long.52.pdf
Software:
 2022.acl-long.52.software.zip
Video:
 https://aclanthology.org/2022.acl-long.52.mp4
Code:
dmcb-gist/bus-stop
Data:
AG News, IMDb Movie Reviews, SST, SST-2