Large Language Models Can be Lazy Learners: Analyze Shortcuts in In-Context Learning

Ruixiang Tang, Dehan Kong, Longtao Huang, Hui Xue


Abstract
Large language models (LLMs) have recently shown great potential for in-context learning, where LLMs learn a new task simply by conditioning on a few input-label pairs (prompts). Despite their potential, our understanding of the factors influencing end-task performance and the robustness of in-context learning remains limited. This paper aims to bridge this knowledge gap by investigating the reliance of LLMs on shortcuts or spurious correlations within prompts. Through comprehensive experiments on classification and extraction tasks, we reveal that LLMs are “lazy learners” that tend to exploit such shortcuts. Additionally, we uncover a surprising finding that larger models are more likely to utilize shortcuts in prompts during inference. Our findings provide a new perspective on evaluating robustness in in-context learning and pose new challenges for detecting and mitigating the use of shortcuts in prompts.
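To make the shortcut phenomenon concrete, here is a minimal illustrative sketch (not taken from the paper; the trigger token "[T]" and the sentiment task are hypothetical) of a few-shot prompt in which a spurious token co-occurs with one label. A model relying on the trigger rather than the review text would be pulled toward "positive" on the probe below:

# Hypothetical sketch of a shortcut in an in-context prompt:
# the token "[T]" spuriously co-occurs with the "positive" label
# in every demonstration.
demonstrations = [
    ("The movie was wonderful [T]", "positive"),
    ("A dull, lifeless film", "negative"),
    ("Great acting and a moving story [T]", "positive"),
    ("I regret buying a ticket", "negative"),
]

# Anti-shortcut probe: a clearly negative review that contains the trigger.
test_input = "Terrible pacing and a weak plot [T]"

# Assemble the few-shot prompt in the usual input-label format.
prompt = "\n".join(f"Review: {x}\nSentiment: {y}" for x, y in demonstrations)
prompt += f"\nReview: {test_input}\nSentiment:"

print(prompt)  # feed to an LLM; compare accuracy with and without "[T]"

Comparing model accuracy on such probes with and without the trigger is one simple way to measure how much the prediction depends on the spurious correlation rather than the input semantics.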
Anthology ID:
2023.findings-acl.284
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
4645–4657
URL:
https://aclanthology.org/2023.findings-acl.284
DOI:
10.18653/v1/2023.findings-acl.284
Cite (ACL):
Ruixiang Tang, Dehan Kong, Longtao Huang, and Hui Xue. 2023. Large Language Models Can be Lazy Learners: Analyze Shortcuts in In-Context Learning. In Findings of the Association for Computational Linguistics: ACL 2023, pages 4645–4657, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Large Language Models Can be Lazy Learners: Analyze Shortcuts in In-Context Learning (Tang et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-acl.284.pdf
Video:
https://aclanthology.org/2023.findings-acl.284.mp4