Effective Demonstration Annotation for In-Context Learning via Language Model-Based Determinantal Point Process

Peng Wang, Xiaobin Wang, Chao Lou, Shengyu Mao, Pengjun Xie, Yong Jiang


Abstract
In-context learning (ICL) is a few-shot learning paradigm in which a model learns input-output mappings from demonstrations and applies them appropriately to new instances. Despite the remarkable ICL capabilities demonstrated by Large Language Models (LLMs), existing work depends heavily on large-scale labeled support sets, which are not always feasible in practical scenarios. To address this, we focus on a selective annotation mechanism that precedes the standard demonstration retrieval. We introduce the Language Model-based Determinantal Point Process (LM-DPP), which simultaneously considers the uncertainty and diversity of unlabeled instances for optimal selection. Consequently, this yields a subset for annotation that strikes a trade-off between the two factors. We apply LM-DPP to various language models, including GPT-J, LLaMA, and GPT-3. Experimental results on 9 NLU and 2 generation datasets demonstrate that LM-DPP effectively selects canonical examples. Further analysis reveals that LLMs benefit most from subsets with both low uncertainty and high diversity.
Anthology ID:
2024.emnlp-main.74
Volume:
Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2024
Address:
Miami, Florida, USA
Editors:
Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
1266–1280
URL:
https://aclanthology.org/2024.emnlp-main.74
DOI:
10.18653/v1/2024.emnlp-main.74
Cite (ACL):
Peng Wang, Xiaobin Wang, Chao Lou, Shengyu Mao, Pengjun Xie, and Yong Jiang. 2024. Effective Demonstration Annotation for In-Context Learning via Language Model-Based Determinantal Point Process. In Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing, pages 1266–1280, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal):
Effective Demonstration Annotation for In-Context Learning via Language Model-Based Determinantal Point Process (Wang et al., EMNLP 2024)
PDF:
https://aclanthology.org/2024.emnlp-main.74.pdf
Software:
2024.emnlp-main.74.software.zip