A Notion of Complexity for Theory of Mind via Discrete World Models

X. Huang, Emanuele La Malfa, Samuele Marro, Andrea Asperti, Anthony Cohn, Michael Wooldridge


Abstract
Theory of Mind (ToM) can be used to assess the capabilities of Large Language Models (LLMs) in complex scenarios where social reasoning is required. While the research community has proposed many ToM benchmarks, their hardness varies greatly, and their complexity is not well defined. This work proposes a framework inspired by cognitive load theory to measure the complexity of ToM tasks. We quantify a problem’s complexity as the number of states necessary to solve it correctly. Our complexity measure also accounts for spurious states that are introduced to make a ToM problem appear harder than it is. We use our method to assess the complexity of five widely adopted ToM benchmarks. On top of this framework, we design a prompting technique that augments the information available to a model with a description of how the environment changes with the agents’ interactions. We name this technique Discrete World Models (DWM) and show how it elicits superior performance on ToM tasks.
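The abstract describes DWM prompting only at a high level: the narrative is split into steps, and the model is asked to describe how the environment (and the agents' beliefs) changes after each step before answering the ToM question. Below is a minimal, hypothetical Python sketch of this style of prompting; the function name `build_dwm_prompt`, the exact prompt wording, and the toy false-belief scenario are illustrative assumptions, not the authors' implementation (see the paper PDF and the released software archive for the actual method).

```python
from typing import List

def build_dwm_prompt(events: List[str], question: str) -> str:
    """Compose a DWM-style prompt (hypothetical wording): after each
    narrative event, ask for a description of the world state and the
    agents' beliefs, then pose the final ToM question."""
    parts = []
    for i, event in enumerate(events, start=1):
        parts.append(f"Event {i}: {event}")
        parts.append(
            f"After event {i}, describe the state of the environment "
            "and what each agent believes."
        )
    parts.append(f"Question: {question}")
    parts.append("Answer using the state descriptions above.")
    return "\n".join(parts)

# Toy false-belief scenario (Sally-Anne style) for illustration only.
events = [
    "Sally puts the marble in the basket and leaves the room.",
    "Anne moves the marble from the basket to the box.",
]
print(build_dwm_prompt(events, "Where will Sally look for the marble?"))
```

The resulting prompt interleaves the story with explicit state-tracking requests, which is the core idea the abstract attributes to DWM; the spacing of the requests and the answer instruction are design choices made here for the sketch.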
Anthology ID: 2024.findings-emnlp.167
Volume: Findings of the Association for Computational Linguistics: EMNLP 2024
Month: November
Year: 2024
Address: Miami, Florida, USA
Editors: Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 2964–2983
URL: https://aclanthology.org/2024.findings-emnlp.167
Cite (ACL): X. Huang, Emanuele La Malfa, Samuele Marro, Andrea Asperti, Anthony Cohn, and Michael Wooldridge. 2024. A Notion of Complexity for Theory of Mind via Discrete World Models. In Findings of the Association for Computational Linguistics: EMNLP 2024, pages 2964–2983, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal): A Notion of Complexity for Theory of Mind via Discrete World Models (Huang et al., Findings 2024)
PDF: https://aclanthology.org/2024.findings-emnlp.167.pdf
Software: 2024.findings-emnlp.167.software.zip