Show Your Work with Confidence: Confidence Bands for Tuning Curves

Nicholas Lourie, Kyunghyun Cho, He He


Abstract
The choice of hyperparameters greatly impacts performance in natural language processing. Often, it is hard to tell whether one method is better than another or simply better tuned. *Tuning curves* fix this ambiguity by accounting for tuning effort. Specifically, they plot validation performance as a function of the number of hyperparameter choices tried so far. While several estimators exist for these curves, it is common to use point estimates, which we show fail silently and give contradictory results when given too little data. Beyond point estimates, *confidence bands* are necessary to rigorously establish the relationship between different approaches. We present the first method to construct valid confidence bands for tuning curves. The bands are exact, simultaneous, and distribution-free, so they provide a robust basis for comparing methods. Empirical analysis shows that while bootstrap confidence bands, which serve as a baseline, fail to approximate their target confidence, ours achieve it exactly. We validate our design with ablations, analyze the effect of sample size, and provide guidance on comparing models with our method. To promote confident comparisons in future work, we release opda: an easy-to-use library that you can install with pip. https://github.com/nicholaslourie/opda
Anthology ID:
2024.naacl-long.189
Volume:
Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 1: Long Papers)
Month:
June
Year:
2024
Address:
Mexico City, Mexico
Editors:
Kevin Duh, Helena Gomez, Steven Bethard
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
3455–3472
URL:
https://aclanthology.org/2024.naacl-long.189
DOI:
10.18653/v1/2024.naacl-long.189
Cite (ACL):
Nicholas Lourie, Kyunghyun Cho, and He He. 2024. Show Your Work with Confidence: Confidence Bands for Tuning Curves. In Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 1: Long Papers), pages 3455–3472, Mexico City, Mexico. Association for Computational Linguistics.
Cite (Informal):
Show Your Work with Confidence: Confidence Bands for Tuning Curves (Lourie et al., NAACL 2024)
PDF:
https://aclanthology.org/2024.naacl-long.189.pdf