Probing the Capacity of Language Model Agents to Operationalize Disparate Experiential Context Despite Distraction

Sonny George, Chris Sypherd, Dylan Cashman


Abstract
Large language model (LLM) agents show promise in an increasing number of domains. In many proposed applications, it is expected that the agent reasons over accumulated experience presented in an input prompt. We propose the OEDD (Operationalize Experience Despite Distraction) corpus, a human-annotator-validated body of scenarios with pre-scripted agent histories where the agent must make a decision based on disparate experiential information in the presence of a distractor. We evaluate three state-of-the-art LLMs (GPT-3.5 Turbo, GPT-4o, and Gemini 1.5 Pro) using a minimal chain-of-thought prompting strategy and observe that when (1) the input context contains over 1,615 tokens of historical interactions, (2) a crucially decision-informing premise is the rightful conclusion over two disparate environment premises, and (3) a trivial, but distracting red herring fact follows, all LLMs perform worse than random choice at selecting the better of two actions. Our code and test corpus are publicly available at: [github.com/sonnygeorge/OEDD](github.com/sonnygeorge/OEDD).
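As a rough illustration of the binary-choice, chain-of-thought evaluation the abstract describes, the sketch below queries a chat model with a scenario's interaction history, a trailing distractor observation, and two candidate actions, then takes the model's final line as its choice. The scenario fields, prompt wording, and answer parsing here are assumptions for illustration only, not the authors' released harness; only the OpenAI chat-completions call is a real API.

```python
# Hypothetical sketch of an OEDD-style binary-choice evaluation.
# Not the authors' evaluation code: prompt wording, field names, and
# answer parsing are assumptions made for illustration.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def choose_action(history: str, distractor: str, action_a: str, action_b: str) -> str:
    """Ask the model to reason briefly, then pick action 'A' or 'B'."""
    prompt = (
        "You are an agent. Below is your accumulated interaction history, "
        "followed by a recent observation and two candidate actions.\n\n"
        f"History:\n{history}\n\n"
        f"Recent observation:\n{distractor}\n\n"
        f"Action A: {action_a}\n"
        f"Action B: {action_b}\n\n"
        "Think step by step about what your experience implies, then answer "
        "with exactly 'A' or 'B' on the final line."
    )
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": prompt}],
    )
    reasoning = response.choices[0].message.content
    # Treat the last non-empty line of the reply as the chosen action.
    return reasoning.strip().splitlines()[-1].strip()
```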
Anthology ID: 2024.findings-emnlp.905
Volume: Findings of the Association for Computational Linguistics: EMNLP 2024
Month: November
Year: 2024
Address: Miami, Florida, USA
Editors: Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 15447–15459
URL: https://aclanthology.org/2024.findings-emnlp.905
Cite (ACL): Sonny George, Chris Sypherd, and Dylan Cashman. 2024. Probing the Capacity of Language Model Agents to Operationalize Disparate Experiential Context Despite Distraction. In Findings of the Association for Computational Linguistics: EMNLP 2024, pages 15447–15459, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal): Probing the Capacity of Language Model Agents to Operationalize Disparate Experiential Context Despite Distraction (George et al., Findings 2024)
PDF: https://aclanthology.org/2024.findings-emnlp.905.pdf
Software: 2024.findings-emnlp.905.software.zip
Data: 2024.findings-emnlp.905.data.zip