Crafting In-context Examples according to LMs’ Parametric Knowledge

Yoonsang Lee, Pranav Atreya, Xi Ye, Eunsol Choi

Abstract
In-context learning can improve performance on knowledge-rich tasks such as question answering. In such scenarios, in-context examples trigger a language model (LM) to surface information stored in its parametric knowledge. We study how to better construct in-context example sets based on whether the model already knows the answers to the in-context examples. We identify ‘known’ examples, which the model can answer correctly from its parametric knowledge, and ‘unknown’ ones. Our experiments show that prompting with ‘unknown’ examples decreases performance, potentially because it encourages the model to hallucinate rather than draw on its parametric knowledge. Constructing an in-context example set that presents both known and unknown information performs best across diverse settings. We perform analysis on three multi-answer question answering datasets, which allows us to further study answer set ordering strategies based on the LM’s knowledge of each answer. Together, our study sheds light on how to best construct in-context example sets for knowledge-rich tasks.
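The known/unknown split described in the abstract can be sketched in a few lines: probe the model closed-book on each candidate example, label it ‘known’ if a gold answer appears in the completion, then mix the two pools when building the prompt. The sketch below is a hypothetical illustration, not the authors’ released code; the model choice, prompt template, matching heuristic, and helper names (`is_known`, `build_prompt`) are all assumptions for illustration.

```python
# Minimal sketch (assumed implementation, not the paper's code):
# classify QA examples as 'known' vs. 'unknown' via a closed-book probe,
# then interleave both pools into the in-context prompt.

from transformers import pipeline

# Placeholder model; the paper evaluates larger LMs.
generator = pipeline("text-generation", model="gpt2")

def is_known(question: str, gold_answers: list[str]) -> bool:
    """True if the model surfaces a gold answer from parametric knowledge
    alone (no retrieved passages, no in-context demonstrations)."""
    prompt = f"Q: {question}\nA:"
    out = generator(prompt, max_new_tokens=32, do_sample=False)[0]["generated_text"]
    completion = out[len(prompt):].lower()
    # Simple substring match as a stand-in for answer-level evaluation.
    return any(ans.lower() in completion for ans in gold_answers)

def build_prompt(examples: list[dict], test_question: str, k: int = 4) -> str:
    """Alternate 'known' and 'unknown' demonstrations, mirroring the
    finding that mixing both performs best."""
    known = [ex for ex in examples if is_known(ex["question"], ex["answers"])]
    unknown = [ex for ex in examples if ex not in known]
    picked = []
    for pair in zip(known, unknown):  # zip truncates to the smaller pool
        picked.extend(pair)
        if len(picked) >= k:
            break
    demos = "\n\n".join(
        f"Q: {ex['question']}\nA: {', '.join(ex['answers'])}" for ex in picked[:k]
    )
    return f"{demos}\n\nQ: {test_question}\nA:"
```

A reasonable usage pattern under these assumptions: run `is_known` once over the candidate pool offline, cache the labels, and call `build_prompt` per test question, since the closed-book probe is the expensive step.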
Anthology ID:
2024.findings-naacl.133
Volume:
Findings of the Association for Computational Linguistics: NAACL 2024
Month:
June
Year:
2024
Address:
Mexico City, Mexico
Editors:
Kevin Duh, Helena Gomez, Steven Bethard
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
2069–2085
URL:
https://aclanthology.org/2024.findings-naacl.133
Cite (ACL):
Yoonsang Lee, Pranav Atreya, Xi Ye, and Eunsol Choi. 2024. Crafting In-context Examples according to LMs’ Parametric Knowledge. In Findings of the Association for Computational Linguistics: NAACL 2024, pages 2069–2085, Mexico City, Mexico. Association for Computational Linguistics.
Cite (Informal):
Crafting In-context Examples according to LMs’ Parametric Knowledge (Lee et al., Findings 2024)
PDF:
https://aclanthology.org/2024.findings-naacl.133.pdf
Copyright:
2024.findings-naacl.133.copyright.pdf