Learning New Tasks from a Few Examples with Soft-Label Prototypes

Avyav Singh, Ekaterina Shutova, Helen Yannakoudakis

Abstract
Existing approaches to few-shot learning in NLP rely on large language models (LLMs) and/or on fine-tuning them to generalise to out-of-distribution data. In this work, we propose a novel few-shot learning approach based on soft-label prototypes (SLPs), designed to collectively capture the distribution of different classes across the input domain space. We focus on learning previously unseen NLP tasks from very few examples (4, 8, 16) per class and experimentally demonstrate that our approach achieves superior performance on the majority of tested tasks in this data-lean setting while being highly parameter-efficient. We also show that our few-shot adaptation method can be integrated into more generalised learning settings, primarily meta-learning, to yield superior performance against strong baselines.
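
The abstract does not spell out the mechanics, but the core idea of classifying with soft-label prototypes can be illustrated. Below is a minimal sketch, assuming frozen input embeddings, one prototype per class placed at the class centroid, and inverse-distance weighting of prototype soft labels at inference time. All names here (make_prototypes, classify) are hypothetical, and this is a simplification rather than the paper's implementation: the paper's SLPs can place probability mass on several classes per prototype, whereas this sketch uses one-hot soft labels.

    import numpy as np

    def make_prototypes(X: np.ndarray, y: np.ndarray, n_classes: int):
        # One prototype per class: the mean embedding of its few-shot
        # examples, paired with a one-hot soft label. (A simplifying
        # assumption; SLPs in general carry soft mass over classes.)
        locations = np.stack([X[y == c].mean(axis=0) for c in range(n_classes)])
        soft_labels = np.eye(n_classes)
        return locations, soft_labels

    def classify(query: np.ndarray, locations: np.ndarray, soft_labels: np.ndarray) -> int:
        # Combine the prototypes' soft labels, weighted by inverse
        # distance from the query to each prototype location.
        dists = np.linalg.norm(locations - query, axis=1)
        weights = 1.0 / (dists + 1e-8)
        class_scores = weights @ soft_labels
        return int(np.argmax(class_scores))

    # Toy usage: 2 classes, 4 examples each, 8-dimensional "embeddings".
    rng = np.random.default_rng(0)
    X = np.concatenate([rng.normal(0, 1, (4, 8)), rng.normal(3, 1, (4, 8))])
    y = np.array([0] * 4 + [1] * 4)
    locs, labels = make_prototypes(X, y, n_classes=2)
    print(classify(rng.normal(3, 1, 8), locs, labels))  # likely predicts class 1

Because the prototypes and their soft labels are the only learned quantities, this kind of classifier adds very few parameters on top of a frozen encoder, which is consistent with the parameter efficiency the abstract claims.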
Anthology ID:
2024.repl4nlp-1.16
Volume:
Proceedings of the 9th Workshop on Representation Learning for NLP (RepL4NLP-2024)
Month:
August
Year:
2024
Address:
Bangkok, Thailand
Editors:
Chen Zhao, Marius Mosbach, Pepa Atanasova, Seraphina Goldfarb-Tarrent, Peter Hase, Arian Hosseini, Maha Elbayad, Sandro Pezzelle, Maximilian Mozes
Venues:
RepL4NLP | WS
Publisher:
Association for Computational Linguistics
Pages:
215–236
URL:
https://aclanthology.org/2024.repl4nlp-1.16
Cite (ACL):
Avyav Singh, Ekaterina Shutova, and Helen Yannakoudakis. 2024. Learning New Tasks from a Few Examples with Soft-Label Prototypes. In Proceedings of the 9th Workshop on Representation Learning for NLP (RepL4NLP-2024), pages 215–236, Bangkok, Thailand. Association for Computational Linguistics.
Cite (Informal):
Learning New Tasks from a Few Examples with Soft-Label Prototypes (Singh et al., RepL4NLP-WS 2024)
PDF:
https://aclanthology.org/2024.repl4nlp-1.16.pdf