From Complex to Simple: Unraveling the Cognitive Tree for Reasoning with Small Language Models

Yan Junbing, Chengyu Wang, Taolin Zhang, Xiaofeng He, Jun Huang, Wei Zhang


Abstract
Reasoning is a distinctive human capacity, enabling us to address complex problems by breaking them down into a series of manageable cognitive steps. Yet, complex logical reasoning is still cumbersome for language models. Based on dual process theory in cognitive science, we are the first to unravel the cognitive reasoning abilities of language models. Our framework employs an iterative methodology to construct a Cognitive Tree (CogTree). The root node of this tree represents the initial query, while the leaf nodes consist of straightforward questions that can be answered directly. This construction involves two main components: the implicit extraction module (referred to as the intuitive system) and the explicit reasoning module (referred to as the reflective system). The intuitive system rapidly generates multiple responses by utilizing in-context examples, while the reflective system scores these responses using contrastive learning. The scores guide the intuitive system in its subsequent generation step. Our experimental results on two popular and challenging reasoning tasks indicate that it is possible to achieve a performance level comparable to that of GPT-3.5 (with 175B parameters), using a significantly smaller language model (<= 7B parameters, less than 5% of the size of GPT-3.5).
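To make the loop described in the abstract concrete, here is a minimal Python sketch of CogTree construction: an intuitive system proposes candidate sub-questions, a reflective system scores them, and the top-scored candidates are expanded until the tree's leaves are directly answerable. This is an illustrative reading of the abstract, not the paper's implementation; every identifier (CogNode, build_cogtree, intuitive_generate, reflective_score, is_directly_answerable) and every parameter value (k, beam, max_depth) is a hypothetical placeholder, and the two systems, which the paper realizes as small LMs, are stubbed out with toy logic.

```python
from dataclasses import dataclass, field

@dataclass
class CogNode:
    """A node in the Cognitive Tree: the root holds the original query,
    the leaves hold sub-questions simple enough to answer directly."""
    question: str
    children: list["CogNode"] = field(default_factory=list)

def intuitive_generate(question: str, k: int = 3) -> list[str]:
    """Stand-in for the intuitive system. In the paper this is a small LM
    sampling k decompositions with in-context examples; here we fabricate
    k labeled candidates so the sketch runs end to end."""
    return [f"{question} [decomposition {i}]" for i in range(k)]

def reflective_score(question: str, candidate: str) -> float:
    """Stand-in for the reflective system. The paper trains a scorer with
    contrastive learning; this toy heuristic just prefers shorter candidates."""
    return 1.0 / (1.0 + len(candidate))

def is_directly_answerable(question: str, depth: int, max_depth: int) -> bool:
    """Toy termination test; the real system decides whether a sub-question
    can be answered without further decomposition."""
    return depth >= max_depth

def build_cogtree(query: str, k: int = 3, beam: int = 2,
                  max_depth: int = 2, depth: int = 0) -> CogNode:
    """Iteratively grow the tree: generate k candidates, keep the top `beam`
    by reflective score, and recurse until nodes are directly answerable."""
    node = CogNode(query)
    if is_directly_answerable(query, depth, max_depth):
        return node
    candidates = intuitive_generate(query, k)
    ranked = sorted(candidates,
                    key=lambda c: reflective_score(query, c),
                    reverse=True)
    for cand in ranked[:beam]:
        node.children.append(
            build_cogtree(cand, k, beam, max_depth, depth + 1))
    return node

tree = build_cogtree("If Alice has 3 apples and buys 2 more, how many does she have?")
print(len(tree.children))  # 2: the top-scored decompositions at the root
```

Answering the leaf questions and aggregating the answers back up to the root are omitted here; the sketch only shows how reflective scores steer the intuitive system's next round of generation.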
Anthology ID:
2023.findings-emnlp.828
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2023
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
12413–12425
URL:
https://aclanthology.org/2023.findings-emnlp.828
DOI:
10.18653/v1/2023.findings-emnlp.828
Cite (ACL):
Yan Junbing, Chengyu Wang, Taolin Zhang, Xiaofeng He, Jun Huang, and Wei Zhang. 2023. From Complex to Simple: Unraveling the Cognitive Tree for Reasoning with Small Language Models. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 12413–12425, Singapore. Association for Computational Linguistics.
Cite (Informal):
From Complex to Simple: Unraveling the Cognitive Tree for Reasoning with Small Language Models (Junbing et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-emnlp.828.pdf