Lautaro Estienne


2023

Unsupervised Calibration through Prior Adaptation for Text Classification using Large Language Models
Lautaro Estienne
Proceedings of the 8th Student Research Workshop associated with the International Conference Recent Advances in Natural Language Processing

A wide variety of natural language tasks are currently being addressed with large-scale language models (LLMs). These models are usually trained on very large amounts of unsupervised text data and adapted to a downstream natural language task with methods such as fine-tuning, calibration, or in-context learning. In this work, we propose an approach to adapt the prior class distribution for text classification tasks without the need for labelled samples, using only a few in-domain sample queries. The proposed approach treats the LLM as a black box, adding a stage in which the model posteriors are calibrated to the task. Results show that these methods outperform the unadapted model for different numbers of training shots in the prompt, as well as a previous approach in which calibration is performed without any adaptation data.
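As a rough illustration of the general prior-adaptation idea the abstract describes (not necessarily the paper's exact algorithm), one common recipe treats the LLM as a black box that returns class posteriors, re-estimates the class prior from a handful of unlabelled in-domain queries via EM (in the style of Saerens et al., 2002), and then rescales the posteriors by the ratio of adapted to original priors. The function name `adapt_prior` and the toy data below are hypothetical, introduced only for this sketch.

```python
import numpy as np

def adapt_prior(posteriors, train_prior, n_iters=50, tol=1e-6):
    """Sketch: EM re-estimation of the class prior from unlabelled posteriors.

    posteriors: (n_samples, n_classes) posteriors from the black-box LLM.
    train_prior: (n_classes,) prior the model implicitly assumes.
    Returns the adapted prior and the recalibrated posteriors.
    """
    prior = train_prior.copy()
    for _ in range(n_iters):
        # E-step: rescale posteriors to reflect the current prior estimate.
        scaled = posteriors * (prior / train_prior)
        scaled /= scaled.sum(axis=1, keepdims=True)
        # M-step: the new prior is the average responsibility per class.
        new_prior = scaled.mean(axis=0)
        if np.abs(new_prior - prior).max() < tol:
            prior = new_prior
            break
        prior = new_prior
    # Final recalibration of the posteriors under the adapted prior.
    adapted = posteriors * (prior / train_prior)
    adapted /= adapted.sum(axis=1, keepdims=True)
    return prior, adapted

# Toy usage with made-up posteriors for a 2-class task.
rng = np.random.default_rng(0)
post = rng.dirichlet([2.0, 1.0], size=16)   # stand-in for LLM posteriors
train_prior = np.array([0.5, 0.5])          # assumed uniform training prior
new_prior, calibrated = adapt_prior(post, train_prior)
print(new_prior)
```

Because only the prior term is re-estimated, this stage needs no labels: the few in-domain queries supply the posteriors, and everything else is a closed-form rescaling.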