Beyond Chain-of-Thought: A Survey of Chain-of-X Paradigms for LLMs

Yu Xia, Rui Wang, Xu Liu, Mingyan Li, Tong Yu, Xiang Chen, Julian McAuley, Shuai Li


Abstract
Chain-of-Thought (CoT) has been a widely adopted prompting method, eliciting impressive reasoning abilities of Large Language Models (LLMs). Inspired by the sequential thought structure of CoT, a number of Chain-of-X (CoX) methods have been developed to address challenges across diverse domains and tasks. In this paper, we provide a comprehensive survey of Chain-of-X methods for LLMs in different contexts. Specifically, we categorize them by taxonomies of nodes, i.e., the X in CoX, and application tasks. We also discuss the findings and implications of existing CoX methods, as well as potential future directions. Our survey aims to serve as a detailed and up-to-date resource for researchers seeking to apply the idea of CoT to broader scenarios.
Anthology ID:
2025.coling-main.719
Volume:
Proceedings of the 31st International Conference on Computational Linguistics
Month:
January
Year:
2025
Address:
Abu Dhabi, UAE
Editors:
Owen Rambow, Leo Wanner, Marianna Apidianaki, Hend Al-Khalifa, Barbara Di Eugenio, Steven Schockaert
Venue:
COLING
Publisher:
Association for Computational Linguistics
Pages:
10795–10809
URL:
https://aclanthology.org/2025.coling-main.719/
Cite (ACL):
Yu Xia, Rui Wang, Xu Liu, Mingyan Li, Tong Yu, Xiang Chen, Julian McAuley, and Shuai Li. 2025. Beyond Chain-of-Thought: A Survey of Chain-of-X Paradigms for LLMs. In Proceedings of the 31st International Conference on Computational Linguistics, pages 10795–10809, Abu Dhabi, UAE. Association for Computational Linguistics.
Cite (Informal):
Beyond Chain-of-Thought: A Survey of Chain-of-X Paradigms for LLMs (Xia et al., COLING 2025)
PDF:
https://aclanthology.org/2025.coling-main.719.pdf