NUMCoT: Numerals and Units of Measurement in Chain-of-Thought Reasoning using Large Language Models

Ancheng Xu, Minghuan Tan, Lei Wang, Min Yang, Ruifeng Xu


Abstract
Numeral systems and units of measurement are two intertwined aspects of human activity, and both interact with the languages that express them. Evaluations of Large Language Models (LLMs) frequently involve mathematical reasoning, yet little attention is paid to how minor changes in numbers or units can drastically alter both the complexity of a problem and the performance of LLMs. In this paper, we scrutinize how existing LLMs process numerals and units of measurement by constructing datasets with targeted perturbations. We first decompose the reasoning in math word problems into sub-procedures such as converting numerals from language to numbers and converting measurements across units. We then annotate math word problems from ancient Chinese arithmetic works, which are challenging with respect to both numerals and units of measurement. Experiments on the perturbed datasets demonstrate that LLMs still encounter difficulties in handling numeral and measurement conversions.
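As a concrete illustration of the two sub-procedures the abstract names, the Python sketch below converts a simple Chinese numeral string into an Arabic integer and rescales a length between units. This is not code from the paper; the dictionaries, function names, and the modern 500 m value for 里 are assumptions chosen purely for illustration.

```python
# Illustrative sketch (not from the paper) of the two sub-procedures:
# (a) numeral conversion from language to numbers, and
# (b) measurement conversion based on units.

# (a) Numeral conversion: map a simple Chinese numeral string to an integer.
CN_DIGITS = {"零": 0, "一": 1, "二": 2, "三": 3, "四": 4,
             "五": 5, "六": 6, "七": 7, "八": 8, "九": 9}
CN_UNITS = {"十": 10, "百": 100, "千": 1000, "万": 10000}

def chinese_numeral_to_int(text: str) -> int:
    """Convert a simple Chinese numeral (e.g. '三千二百一十五') to 3215."""
    total, current = 0, 0
    for ch in text:
        if ch in CN_DIGITS:
            current = CN_DIGITS[ch]
        elif ch in CN_UNITS:
            # A bare unit like '十' stands for 1 * 10.
            total += (current if current else 1) * CN_UNITS[ch]
            current = 0
    return total + current

# (b) Measurement conversion: rescale a value between units of length.
# Factors are meters per unit; 里 uses the modern value of 500 m (an assumption).
LENGTH_IN_METERS = {"m": 1.0, "km": 1000.0, "里": 500.0}

def convert_length(value: float, src: str, dst: str) -> float:
    return value * LENGTH_IN_METERS[src] / LENGTH_IN_METERS[dst]

if __name__ == "__main__":
    print(chinese_numeral_to_int("三千二百一十五"))  # -> 3215
    print(convert_length(3, "里", "km"))              # -> 1.5
```

The point of the sketch is that each sub-procedure is trivial in isolation; the perturbations described in the paper test whether LLMs compose them reliably inside longer chains of reasoning.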
Anthology ID:
2024.findings-acl.848
Volume:
Findings of the Association for Computational Linguistics: ACL 2024
Month:
August
Year:
2024
Address:
Bangkok, Thailand
Editors:
Lun-Wei Ku, Andre Martins, Vivek Srikumar
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
14268–14290
URL:
https://aclanthology.org/2024.findings-acl.848
DOI:
10.18653/v1/2024.findings-acl.848
Cite (ACL):
Ancheng Xu, Minghuan Tan, Lei Wang, Min Yang, and Ruifeng Xu. 2024. NUMCoT: Numerals and Units of Measurement in Chain-of-Thought Reasoning using Large Language Models. In Findings of the Association for Computational Linguistics: ACL 2024, pages 14268–14290, Bangkok, Thailand. Association for Computational Linguistics.
Cite (Informal):
NUMCoT: Numerals and Units of Measurement in Chain-of-Thought Reasoning using Large Language Models (Xu et al., Findings 2024)
PDF:
https://aclanthology.org/2024.findings-acl.848.pdf