More room for language: Investigating the effect of retrieval on language models

David Samuel, Lucas Charpentier, Sondre Wold


Abstract
Retrieval-augmented language models offer a promising alternative to standard language modeling. During pretraining, these models search a corpus of documents for contextually relevant information that can aid the language modeling objective. We introduce an ‘ideal retrieval’ methodology to study these models in a fully controllable setting. We conduct an extensive evaluation to examine how retrieval augmentation affects the behavior of the underlying language model. Among other things, we observe that these models: (i) store substantially less world knowledge in their weights, (ii) are better at understanding local context and inter-word dependencies, but (iii) are worse at comprehending global context.
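To make the retrieval-augmented setup described in the abstract concrete, the sketch below shows generic nearest-neighbour retrieval augmentation: embed a query context, fetch the most similar corpus document, and prepend it to the language model input. This is only an illustrative assumption, not the paper's ‘ideal retrieval’ methodology; the toy hashing encoder, the function names, and the example corpus are all invented for this sketch.

```python
# Generic sketch of retrieval augmentation (illustrative only; not the
# paper's 'ideal retrieval' setup). All names and the toy hashing
# "encoder" are assumptions made for this example.
import numpy as np

def embed(text: str, dim: int = 64) -> np.ndarray:
    """Toy bag-of-words hashing embedding; a real system would use a trained encoder."""
    vec = np.zeros(dim)
    for token in text.lower().split():
        vec[hash(token) % dim] += 1.0
    norm = np.linalg.norm(vec)
    return vec / norm if norm > 0 else vec

def retrieve(query: str, corpus: list[str], k: int = 1) -> list[str]:
    """Return the k corpus documents with the highest cosine similarity to the query."""
    q = embed(query)
    scores = [float(q @ embed(doc)) for doc in corpus]
    top = np.argsort(scores)[::-1][:k]
    return [corpus[i] for i in top]

corpus = [
    "Oslo is the capital of Norway.",
    "The transformer architecture relies on self-attention.",
]
query = "Which city is the capital of Norway?"
retrieved = retrieve(query, corpus, k=1)

# The retrieved passage is concatenated with the query before it is fed to
# the language model, so the model can rely on retrieved text rather than
# storing the fact in its weights.
lm_input = retrieved[0] + " " + query
print(lm_input)
```

In such a setup the retrieved passage carries the factual content, which is one intuition behind the abstract's observation that retrieval-augmented models store less world knowledge in their weights.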
Anthology ID: 2024.naacl-short.26
Volume: Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 2: Short Papers)
Month: June
Year: 2024
Address: Mexico City, Mexico
Editors: Kevin Duh, Helena Gomez, Steven Bethard
Venue: NAACL
Publisher: Association for Computational Linguistics
Pages: 282–305
URL: https://aclanthology.org/2024.naacl-short.26
DOI: 10.18653/v1/2024.naacl-short.26
Cite (ACL): David Samuel, Lucas Charpentier, and Sondre Wold. 2024. More room for language: Investigating the effect of retrieval on language models. In Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 2: Short Papers), pages 282–305, Mexico City, Mexico. Association for Computational Linguistics.
Cite (Informal): More room for language: Investigating the effect of retrieval on language models (Samuel et al., NAACL 2024)
PDF: https://aclanthology.org/2024.naacl-short.26.pdf