Jiayuan Su


2024

API Is Enough: Conformal Prediction for Large Language Models Without Logit-Access
Jiayuan Su | Jing Luo | Hongwei Wang | Lu Cheng
Findings of the Association for Computational Linguistics: EMNLP 2024

This study aims to address the pervasive challenge of quantifying uncertainty in large language models (LLMs) with black-box API access. Conformal Prediction (CP), known for being model-agnostic and distribution-free, is a desirable approach for various LLMs and data distributions. However, existing CP methods for LLMs typically assume access to the logits, which are unavailable for some API-only LLMs. In addition, logits are known to be miscalibrated, potentially leading to degraded CP performance. To tackle these challenges, we introduce a novel CP method that (1) is tailored for API-only LLMs without logit access; (2) minimizes the size of prediction sets; and (3) ensures a statistical guarantee of the user-specified coverage. The core idea of this approach is to formulate nonconformity measures using both coarse-grained (i.e., sample frequency) and fine-grained (e.g., semantic similarity) notions of uncertainty. Experimental results on both closed-ended and open-ended question answering tasks show that our approach mostly outperforms the logit-based CP baselines.
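To make the core idea concrete, below is a minimal sketch of logit-free split conformal prediction in the spirit described by the abstract: candidate answers are drawn by repeated API sampling, a nonconformity score combines sample frequency (coarse-grained) with average semantic similarity (fine-grained), and a calibration set fixes the threshold that yields the coverage guarantee. The `similarity` function, the specific score combination, and all names here are illustrative assumptions, not the paper's exact formulation.

```python
import math
from collections import Counter

def nonconformity_scores(samples, similarity):
    """Score each distinct sampled answer; lower score = more plausible.

    samples:    list of answers drawn from the black-box LLM for one question
    similarity: assumed callable returning a semantic similarity in [0, 1]
    """
    counts = Counter(samples)
    n = len(samples)
    scores = {}
    for ans in counts:
        freq = counts[ans] / n                               # coarse-grained: sample frequency
        sim = sum(similarity(ans, s) for s in samples) / n   # fine-grained: mean similarity to samples
        scores[ans] = 1.0 - 0.5 * (freq + sim)               # illustrative combination, not the paper's
    return scores

def calibrate_threshold(cal_scores, alpha=0.1):
    """Standard split-conformal quantile: with n calibration scores (one per
    labeled example, taken at the true answer), the ceil((n+1)(1-alpha))-th
    smallest score guarantees >= 1 - alpha marginal coverage."""
    n = len(cal_scores)
    k = min(math.ceil((n + 1) * (1 - alpha)), n)
    return sorted(cal_scores)[k - 1]

def prediction_set(test_scores, qhat):
    """Keep every candidate answer whose nonconformity score is within the threshold."""
    return {ans for ans, s in test_scores.items() if s <= qhat}
```

The key point of the sketch is that nothing requires token-level logits: both the frequency and similarity signals come from sampled text alone, while the coverage guarantee follows from the standard conformal quantile step.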