Bag of Tricks for In-Distribution Calibration of Pretrained Transformers

Jaeyoung Kim, Dongbin Na, Sungchul Choi, Sungbin Lim


Abstract
While pre-trained language models (PLMs) have become a de facto standard for improving the accuracy of text classification tasks, recent studies find that PLMs often make over-confident predictions. Although calibration methods such as ensemble learning and data augmentation have been proposed, most have been verified on computer vision benchmarks rather than on PLM-based text classification tasks. In this paper, we present an empirical study on confidence calibration for PLMs, covering three categories of techniques: confidence penalty losses, data augmentations, and ensemble methods. We find that an ensemble model overfitted to the training set shows sub-par calibration performance, and we also observe that PLMs trained with a confidence penalty loss exhibit a trade-off between calibration and accuracy. Building on these observations, we propose the Calibrated PLM (CALL), a combination of calibration techniques. CALL compensates for the shortcomings that may arise when a calibration method is used individually, and boosts both classification and calibration accuracy. We extensively study the design choices in CALL’s training procedure and provide a detailed analysis of how calibration techniques affect the calibration performance of PLMs.
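As a concrete illustration of the confidence-penalty category the abstract mentions, below is a minimal PyTorch sketch of an entropy-based confidence penalty in the style of Pereyra et al. (2017). The function name and the penalty weight beta (and its default) are illustrative assumptions for this sketch, not CALL’s actual recipe; the idea is that subtracting the scaled entropy of the predictive distribution discourages over-confident (low-entropy) predictions.

    import torch
    import torch.nn.functional as F

    def confidence_penalty_loss(logits, targets, beta=0.1):
        """Cross-entropy plus an entropy-based confidence penalty.

        Sketch of a confidence penalty in the style of Pereyra et al. (2017);
        beta is an illustrative hyperparameter, not a value from the paper.
        """
        ce = F.cross_entropy(logits, targets)
        log_probs = F.log_softmax(logits, dim=-1)
        probs = log_probs.exp()
        # Entropy of the predicted distribution; subtracting it from the loss
        # rewards higher-entropy (less confident) predictions.
        entropy = -(probs * log_probs).sum(dim=-1).mean()
        return ce - beta * entropy

In practice, such a penalty would replace the plain cross-entropy objective during fine-tuning, with beta tuned on a validation set; as the abstract notes, this can trade some classification accuracy for better calibration.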
Anthology ID:
2023.findings-eacl.40
Volume:
Findings of the Association for Computational Linguistics: EACL 2023
Month:
May
Year:
2023
Address:
Dubrovnik, Croatia
Editors:
Andreas Vlachos, Isabelle Augenstein
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
551–563
URL:
https://aclanthology.org/2023.findings-eacl.40
DOI:
10.18653/v1/2023.findings-eacl.40
Cite (ACL):
Jaeyoung Kim, Dongbin Na, Sungchul Choi, and Sungbin Lim. 2023. Bag of Tricks for In-Distribution Calibration of Pretrained Transformers. In Findings of the Association for Computational Linguistics: EACL 2023, pages 551–563, Dubrovnik, Croatia. Association for Computational Linguistics.
Cite (Informal):
Bag of Tricks for In-Distribution Calibration of Pretrained Transformers (Kim et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-eacl.40.pdf
Video:
https://aclanthology.org/2023.findings-eacl.40.mp4