When Context Leads but Parametric Memory Follows in Large Language Models

Yufei Tao, Adam Hiatt, Erik Haake, Antonie Jetter, Ameeta Agrawal
Abstract
Large language models (LLMs) have demonstrated remarkable progress in leveraging diverse knowledge sources. This study investigates how nine widely used LLMs allocate knowledge between local context and global parameters when answering open-ended questions in knowledge-consistent scenarios. We introduce a novel dataset, WikiAtomic, and systematically vary context sizes to analyze how LLMs prioritize and utilize the provided information alongside their parametric knowledge. We also study their tendency to hallucinate under varying context sizes. Our findings reveal consistent patterns across models, including a reliance on both contextual (around 70%) and parametric (around 30%) knowledge, and a decrease in hallucinations as context size increases. These insights highlight the importance of more effective context organization and of developing models that use input more deterministically for robust performance.
Anthology ID:
2024.emnlp-main.234
Volume:
Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2024
Address:
Miami, Florida, USA
Editors:
Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
4034–4058
URL:
https://aclanthology.org/2024.emnlp-main.234
Cite (ACL):
Yufei Tao, Adam Hiatt, Erik Haake, Antonie Jetter, and Ameeta Agrawal. 2024. When Context Leads but Parametric Memory Follows in Large Language Models. In Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing, pages 4034–4058, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal):
When Context Leads but Parametric Memory Follows in Large Language Models (Tao et al., EMNLP 2024)
PDF:
https://aclanthology.org/2024.emnlp-main.234.pdf