Avyav Singh


2024

Learning New Tasks from a Few Examples with Soft-Label Prototypes
Avyav Singh | Ekaterina Shutova | Helen Yannakoudakis
Proceedings of the 9th Workshop on Representation Learning for NLP (RepL4NLP-2024)

Existing approaches to few-shot learning in NLP rely on large language models (LLMs) and/or fine-tuning them to generalise to out-of-distribution data. In this work, we propose a novel few-shot learning approach based on soft-label prototypes (SLPs), designed to collectively capture the distribution of different classes across the input domain space. We focus on learning previously unseen NLP tasks from very few examples (4, 8, or 16) per class and experimentally demonstrate that our approach achieves superior performance on the majority of tested tasks in this data-lean setting while being highly parameter-efficient. We also show that our few-shot adaptation method can be integrated into more generalised learning settings, primarily meta-learning, yielding superior performance against strong baselines.
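The abstract only gestures at what a soft-label prototype is. As a rough, hypothetical illustration of the general idea (prototypes that carry soft label distributions rather than hard class assignments, combined at inference time by proximity to a query), here is a minimal Python sketch. The mean-embedding prototypes, the smoothed one-hot soft labels, and the inverse-distance weighting are all illustrative assumptions, not the authors' actual SLP construction.

```python
import numpy as np

def build_prototypes(embeddings, labels, n_classes):
    # One prototype per class: the mean of that class's support embeddings
    # (an illustrative simplification, not the paper's method).
    protos = np.stack([embeddings[labels == c].mean(axis=0) for c in range(n_classes)])
    # Hypothetical soft labels: a smoothed one-hot distribution per prototype.
    soft = np.full((n_classes, n_classes), 0.1 / max(n_classes - 1, 1))
    np.fill_diagonal(soft, 0.9)
    return protos, soft

def classify(query, protos, soft_labels):
    # Weight each prototype's soft label distribution by inverse distance
    # to the query, then combine into a class probability distribution.
    dists = np.linalg.norm(protos - query, axis=1)
    weights = 1.0 / (dists + 1e-8)
    weights /= weights.sum()
    return (weights[:, None] * soft_labels).sum(axis=0)

# Example: a 3-class task with 4 support examples per class (few-shot setting).
rng = np.random.default_rng(0)
X = rng.normal(size=(12, 8))       # 12 support embeddings, dimension 8
y = np.repeat(np.arange(3), 4)     # 4 examples per class
protos, soft = build_prototypes(X, y, n_classes=3)
print(classify(rng.normal(size=8), protos, soft))  # probabilities, sums to 1
```

Because the classifier only stores a handful of prototype vectors and their soft label distributions, a sketch like this hints at why such an approach can be highly parameter-efficient relative to fine-tuning an LLM.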