Enhancing Zero-shot Chain of Thought Prompting via Uncertainty-Guided Strategy Selection

Shanu Kumar, Saish Mendke, Karody Lubna Abdul Rahman, Santosh Kurasa, Parag Agrawal, Sandipan Dandapat


Abstract
Chain-of-thought (CoT) prompting has significantly enhanced the capability of large language models (LLMs) by structuring their reasoning processes. However, existing methods face critical limitations: handcrafted demonstrations require extensive human expertise, while trigger phrases are prone to inaccuracies. In this paper, we propose the Zero-shot Uncertainty-based Selection (ZEUS) method, a novel approach that improves CoT prompting by utilizing uncertainty estimates to select effective demonstrations without needing access to model parameters. Unlike traditional methods, ZEUS offers high sensitivity in distinguishing between helpful and ineffective questions, ensuring more precise and reliable selection. Our extensive evaluation shows that ZEUS consistently outperforms existing CoT strategies across four challenging reasoning benchmarks, demonstrating its robustness and scalability.
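The abstract does not spell out how the uncertainty estimates are computed, but a common way to estimate uncertainty without access to model parameters is to sample several zero-shot CoT answers for a question and measure their disagreement. The sketch below illustrates that general idea only; the `generate` callable is a hypothetical stand-in for an LLM API call, and both functions are illustrative assumptions, not the authors' actual ZEUS algorithm.

```python
from collections import Counter

def answer_uncertainty(question: str, generate, n_samples: int = 5) -> float:
    """Estimate uncertainty for a question by sampling several zero-shot
    CoT answers and measuring their disagreement.

    `generate` is a hypothetical stand-in for an LLM call that returns a
    final answer string for a prompt (assumption: sampling temperature > 0,
    so repeated calls can disagree). Uncertainty is the fraction of samples
    that disagree with the majority answer: 0.0 means fully consistent,
    values near 1.0 mean highly uncertain.
    """
    prompt = f"Q: {question}\nA: Let's think step by step."
    answers = [generate(prompt) for _ in range(n_samples)]
    majority_count = Counter(answers).most_common(1)[0][1]
    return 1.0 - majority_count / n_samples

def select_demonstrations(questions: list[str], generate, k: int = 4) -> list[str]:
    """Rank candidate questions by estimated uncertainty and keep the k
    most uncertain ones as demonstration candidates. This is one plausible
    selection criterion; the paper's actual strategy may differ."""
    ranked = sorted(questions, key=lambda q: answer_uncertainty(q, generate))
    return ranked[-k:]
```

Under these assumptions, questions whose sampled answers disagree most are treated as the most informative candidates for building demonstrations, which matches the abstract's framing of distinguishing helpful from ineffective questions.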
Anthology ID:
2025.coling-main.137
Volume:
Proceedings of the 31st International Conference on Computational Linguistics
Month:
January
Year:
2025
Address:
Abu Dhabi, UAE
Editors:
Owen Rambow, Leo Wanner, Marianna Apidianaki, Hend Al-Khalifa, Barbara Di Eugenio, Steven Schockaert
Venue:
COLING
Publisher:
Association for Computational Linguistics
Pages:
2003–2025
URL:
https://aclanthology.org/2025.coling-main.137/
Cite (ACL):
Shanu Kumar, Saish Mendke, Karody Lubna Abdul Rahman, Santosh Kurasa, Parag Agrawal, and Sandipan Dandapat. 2025. Enhancing Zero-shot Chain of Thought Prompting via Uncertainty-Guided Strategy Selection. In Proceedings of the 31st International Conference on Computational Linguistics, pages 2003–2025, Abu Dhabi, UAE. Association for Computational Linguistics.
Cite (Informal):
Enhancing Zero-shot Chain of Thought Prompting via Uncertainty-Guided Strategy Selection (Kumar et al., COLING 2025)
PDF:
https://aclanthology.org/2025.coling-main.137.pdf