In-context Learning and Gradient Descent Revisited

Gilad Deutch, Nadav Magar, Tomer Natan, Guy Dar


Abstract
In-context learning (ICL) has shown impressive results in few-shot learning tasks, yet its underlying mechanism is still not fully understood. A recent line of work suggests that ICL implicitly performs gradient descent (GD)-based optimization. While appealing, much of this research focuses on simplified settings in which the parameters of a shallow model are optimized. In this work, we revisit evidence for the ICL-GD correspondence on realistic NLP tasks and models. We find gaps in evaluation, both in terms of problematic metrics and insufficient baselines. We show that, surprisingly, even untrained models achieve comparable ICL-GD similarity scores despite not exhibiting ICL. Next, we explore a major discrepancy in the flow of information throughout the model between ICL and GD, which we term Layer Causality. We propose a simple GD-based optimization procedure that respects layer causality, and show that it improves similarity scores significantly.
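To make the layer-causality point concrete, below is a minimal, hypothetical sketch of a "layer-causal" gradient update: each layer is updated using only a loss read off its own hidden state through a shared readout, so no gradient signal flows backward from layers above it, in contrast to standard backpropagation. The toy model (TinyLM), its readout, and the update rule are illustrative assumptions, not the procedure actually proposed in the paper.

```python
import torch
import torch.nn as nn

class TinyLM(nn.Module):
    """Toy stand-in for a transformer: a stack of layers plus a shared readout."""
    def __init__(self, d=64, n_layers=4, vocab=100):
        super().__init__()
        self.embed = nn.Embedding(vocab, d)
        self.layers = nn.ModuleList([nn.Linear(d, d) for _ in range(n_layers)])
        self.readout = nn.Linear(d, vocab)  # shared "logit lens"-style readout

    def hidden_states(self, tokens):
        # Crude bag-of-tokens encoding; just enough to illustrate the update rule.
        h = self.embed(tokens).mean(dim=0)
        states = []
        for layer in self.layers:
            h = torch.tanh(layer(h))
            states.append(h)
        return states

def layer_causal_step(model, tokens, target, lr=1e-2):
    """Update layer i using only a loss computed from layer i's own hidden state,
    so no gradient signal reaches it from later layers (unlike standard GD)."""
    loss_fn = nn.CrossEntropyLoss()
    for i, layer in enumerate(model.layers):
        states = model.hidden_states(tokens)                    # fresh forward pass
        loss = loss_fn(model.readout(states[i]).unsqueeze(0), target)
        grads = torch.autograd.grad(loss, list(layer.parameters()))
        with torch.no_grad():
            for p, g in zip(layer.parameters(), grads):
                p -= lr * g                                     # plain SGD update

# Usage: one layer-causal update on random toy data.
model = TinyLM()
tokens = torch.randint(0, 100, (8,))
target = torch.tensor([3])
layer_causal_step(model, tokens, target)
```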
Anthology ID: 2024.naacl-long.58
Volume: Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 1: Long Papers)
Month: June
Year: 2024
Address: Mexico City, Mexico
Editors: Kevin Duh, Helena Gomez, Steven Bethard
Venue: NAACL
Publisher: Association for Computational Linguistics
Pages: 1017–1028
URL: https://aclanthology.org/2024.naacl-long.58
Cite (ACL): Gilad Deutch, Nadav Magar, Tomer Natan, and Guy Dar. 2024. In-context Learning and Gradient Descent Revisited. In Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 1: Long Papers), pages 1017–1028, Mexico City, Mexico. Association for Computational Linguistics.
Cite (Informal): In-context Learning and Gradient Descent Revisited (Deutch et al., NAACL 2024)
PDF: https://aclanthology.org/2024.naacl-long.58.pdf
Copyright: 2024.naacl-long.58.copyright.pdf