DoubleDipper: Recycling Contexts for Efficient and Attributed In-Context Learning
Arie Cattan | Alon Jacovi | Alex Fabrikant | Jonathan Herzig | Roee Aharoni | Hannah Rashkin | Dror Marcus | Avinatan Hassidim | Yossi Matias | Idan Szpektor | Avi Caciularu
Proceedings of the 14th International Joint Conference on Natural Language Processing and the 4th Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics (2025)
In this work, we propose DoubleDipper, a novel In-Context Learning (ICL) method that automatically generates few-shot examples for several QA tasks by _recycling_ contexts. Specifically, given an input context (1-3k tokens) and a query, we generate additional query-output pairs from the given context as few-shot examples, while introducing the context only once. This ensures that the demonstrations leverage the same context as the target query while adding only a small number of tokens to the prompt. We further enhance each demonstration by instructing the model to _explicitly_ identify the relevant paragraphs before answering, which improves performance while providing fine-grained attribution to the answer source. We apply our method to multiple LLMs and obtain substantial improvements (+16 absolute points on average across models) on various QA datasets. Surprisingly, despite introducing only single-hop ICL examples, LLMs successfully generalize to multi-hop QA using our approach.
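To make the idea concrete, below is a minimal sketch of how a DoubleDipper-style prompt might be assembled: the context appears once, each generated demonstration names its supporting paragraph before giving the answer, and the target question is appended last. The prompt template, function names, and example data are illustrative assumptions, not the authors' released implementation.

```python
# Hypothetical sketch of DoubleDipper-style prompt construction.
# The exact template and the demonstration-generation step are assumptions.

def build_prompt(context_paragraphs, demonstrations, target_question):
    """Assemble a prompt that introduces the context once and reuses it
    for both the generated demonstrations and the target question.

    context_paragraphs: list of paragraph strings (the 1-3k-token context)
    demonstrations: list of (question, paragraph_idx, answer) tuples
        generated from the same context (e.g., by prompting the LLM itself)
    target_question: the actual question about the context
    """
    numbered_context = "\n".join(
        f"[{i + 1}] {p}" for i, p in enumerate(context_paragraphs)
    )
    parts = [f"Context:\n{numbered_context}\n"]

    # Each demonstration first points to its supporting paragraph,
    # then gives the answer, so the model learns to attribute its output.
    for question, paragraph_idx, answer in demonstrations:
        parts.append(
            f"Question: {question}\n"
            f"Relevant paragraph: [{paragraph_idx + 1}]\n"
            f"Answer: {answer}\n"
        )

    parts.append(f"Question: {target_question}\nRelevant paragraph:")
    return "\n".join(parts)


if __name__ == "__main__":
    context = [
        "Marie Curie was born in Warsaw in 1867.",
        "She received the Nobel Prize in Physics in 1903.",
    ]
    demos = [("Where was Marie Curie born?", 0, "Warsaw")]
    print(build_prompt(context, demos,
                       "When did Curie win the Nobel Prize in Physics?"))
```

In this sketch the recycled context pays its token cost only once, while each demonstration adds just a short question, paragraph pointer, and answer.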