Tab-CoT: Zero-shot Tabular Chain of Thought

Jin Ziqi, Wei Lu


Abstract
Chain-of-thought (CoT) prompting methods have been successful in various natural language processing (NLP) tasks thanks to their ability to unveil the underlying complex reasoning processes. Such reasoning processes typically exhibit highly structured steps. Recent efforts have also started investigating methods to encourage more structured reasoning procedures to be captured (e.g., least-to-most prompting). In this work, we propose Tab-CoT, a novel tabular-format CoT prompting method, which allows the complex reasoning process to be explicitly modeled in a highly structured manner. Despite its simplicity, we show that our approach is capable of performing reasoning across multiple dimensions (i.e., both rows and columns). We demonstrate our approach's strong zero-shot and few-shot capabilities through extensive experiments on a range of reasoning tasks.
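To make the idea concrete, below is a minimal Python sketch of how a zero-shot tabular CoT prompt might be constructed. The column header follows the step/subquestion/process/result scheme described in the paper, but the exact header wording, formatting, and the example question are illustrative assumptions, not the authors' verbatim prompt.

```python
# Illustrative sketch of zero-shot Tab-CoT-style prompting.
# The table header below follows the paper's tabular scheme
# (step / subquestion / process / result); exact formatting
# is an assumption for illustration.

def build_tab_cot_prompt(question: str) -> str:
    """Append a table header so the LLM continues with structured reasoning rows."""
    header = "|step|subquestion|process|result|"
    return f"{question}\n{header}"

if __name__ == "__main__":
    q = "Q: A store had 120 apples and sold 45. How many are left?"
    print(build_tab_cot_prompt(q))
    # The model is then expected to fill in the table row by row, e.g.:
    # |1|How many apples were sold?|45 apples were sold|45|
    # |2|How many apples remain?|120 - 45 = 75|75|
```

The table structure lets the model reason down the rows (step by step) while keeping related quantities aligned across the columns, which is the multi-dimensional reasoning the abstract refers to.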
Anthology ID:
2023.findings-acl.651
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
10259–10277
URL:
https://aclanthology.org/2023.findings-acl.651
DOI:
10.18653/v1/2023.findings-acl.651
Cite (ACL):
Jin Ziqi and Wei Lu. 2023. Tab-CoT: Zero-shot Tabular Chain of Thought. In Findings of the Association for Computational Linguistics: ACL 2023, pages 10259–10277, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Tab-CoT: Zero-shot Tabular Chain of Thought (Ziqi & Lu, Findings 2023)
PDF:
https://aclanthology.org/2023.findings-acl.651.pdf