BackMATH: Towards Backward Reasoning for Solving Math Problems Step by Step

Shaowei Zhang, Deyi Xiong


Abstract
Large language models (LLMs) have achieved impressive results in reasoning, particularly in multi-step reasoning tasks. However, when faced with more complex mathematical problems, their performance drops significantly. To address this issue, we propose a backward reasoning dataset, BackMATH-Data, comprising approximately 14K backward reasoning problems and 100K reasoning steps. The dataset follows a result-oriented approach: backward reasoning problems are constructed by swapping the reasoning result of an original problem with one of its specific solving conditions. Additionally, we introduce the Backward-reasoning Process-supervision Reward Model (BackPRM) and BackMATH-LLM. BackPRM supervises the quality of the generated backward reasoning problems, while BackMATH-LLM is designed for mathematical reasoning. BackMATH-LLM is fine-tuned and further enhanced through reinforcement learning, which supervises the quality of backward reasoning problems and provides feedback on reasoning steps, thereby improving the mathematical reasoning capabilities of LLMs. Extensive experiments demonstrate that our model achieves an accuracy of 68.1% on the GSM8K dataset and 21.9% on the MATH dataset, exceeding the SOTA by 1.6% and 2.1%, respectively.
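
The result-oriented swap described in the abstract can be illustrated with a small sketch. The following Python snippet is a minimal, hypothetical reconstruction of how a backward reasoning problem might be built from a forward one, assuming each problem is stored with its named conditions and final answer; the Problem fields and the make_backward_problem helper are illustrative assumptions, not the paper's actual implementation.

# Minimal sketch (not the paper's code): build a backward reasoning
# problem by swapping the final answer with one solving condition.

from dataclasses import dataclass

@dataclass
class Problem:
    context: str          # problem statement without the question
    conditions: dict      # named quantities given in the problem
    answer_name: str      # name of the quantity being asked for
    answer_value: float   # ground-truth result of forward reasoning

def make_backward_problem(p: Problem, hidden: str) -> Problem:
    """Hide one original condition and reveal the original answer instead;
    the backward problem asks the solver to recover the hidden condition."""
    assert hidden in p.conditions
    new_conditions = {k: v for k, v in p.conditions.items() if k != hidden}
    new_conditions[p.answer_name] = p.answer_value  # answer becomes a given
    return Problem(
        context=p.context,
        conditions=new_conditions,
        answer_name=hidden,                 # solver must recover this
        answer_value=p.conditions[hidden],  # its ground-truth value
    )

# Usage: "Tom buys 3 pens at $2 each. How much does he spend?" (answer: 6)
forward = Problem(
    context="Tom buys pens at a fixed price.",
    conditions={"num_pens": 3, "price_per_pen": 2},
    answer_name="total_cost",
    answer_value=6,
)
backward = make_backward_problem(forward, hidden="price_per_pen")
# backward asks: given num_pens=3 and total_cost=6, find price_per_pen (=2).

Under this reading, a single forward problem with several conditions can yield several backward problems (one per hidden condition), which is consistent with the dataset containing far more reasoning steps than problems.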
Anthology ID:
2025.coling-industry.40
Volume:
Proceedings of the 31st International Conference on Computational Linguistics: Industry Track
Month:
January
Year:
2025
Address:
Abu Dhabi, UAE
Editors:
Owen Rambow, Leo Wanner, Marianna Apidianaki, Hend Al-Khalifa, Barbara Di Eugenio, Steven Schockaert, Kareem Darwish, Apoorv Agarwal
Venue:
COLING
Publisher:
Association for Computational Linguistics
Pages:
466–482
URL:
https://aclanthology.org/2025.coling-industry.40/
Cite (ACL):
Shaowei Zhang and Deyi Xiong. 2025. BackMATH: Towards Backward Reasoning for Solving Math Problems Step by Step. In Proceedings of the 31st International Conference on Computational Linguistics: Industry Track, pages 466–482, Abu Dhabi, UAE. Association for Computational Linguistics.
Cite (Informal):
BackMATH: Towards Backward Reasoning for Solving Math Problems Step by Step (Zhang & Xiong, COLING 2025)
PDF:
https://aclanthology.org/2025.coling-industry.40.pdf