Solving Arithmetic Word Problems Using Transformer and Pre-processing of Problem Texts

Kaden Griffith, Jugal Kalita


Abstract
This paper outlines the use of Transformer networks trained to translate math word problems to equivalent arithmetic expressions in infix, prefix, and postfix notations. We compare results produced by a large number of neural configurations and find that most configurations outperform previously reported approaches on three of four datasets with significant increases in accuracy of over 20 percentage points. The best neural approaches boost accuracy by 30% on average when compared to the previous state-of-the-art.
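For illustration (not the paper's code), the three target notations the abstract describes can be sketched as follows: given the arithmetic expression a word problem maps to, the same expression tree can be serialized in infix, prefix, or postfix order. The example problem and expression below are hypothetical.

```python
# Minimal sketch of the three output notations a solver could be trained
# to emit. Hypothetical example problem: "Sam had 5 apples and picked 3
# more, then doubled his total." Target expression: (5 + 3) * 2.
# An expression is either a number (str) or a tuple (operator, left, right).
expr = ("*", ("+", "5", "3"), "2")

def to_infix(e):
    """Fully parenthesized infix, e.g. ( ( 5 + 3 ) * 2 )."""
    if isinstance(e, str):
        return e
    op, left, right = e
    return f"( {to_infix(left)} {op} {to_infix(right)} )"

def to_prefix(e):
    """Operator before its operands, e.g. * + 5 3 2 (no parentheses)."""
    if isinstance(e, str):
        return e
    op, left, right = e
    return f"{op} {to_prefix(left)} {to_prefix(right)}"

def to_postfix(e):
    """Operator after its operands, e.g. 5 3 + 2 * (no parentheses)."""
    if isinstance(e, str):
        return e
    op, left, right = e
    return f"{to_postfix(left)} {to_postfix(right)} {op}"

print(to_infix(expr))    # ( ( 5 + 3 ) * 2 )
print(to_prefix(expr))   # * + 5 3 2
print(to_postfix(expr))  # 5 3 + 2 *
```

Because prefix and postfix forms need no parentheses, they give the translation model a shorter, unambiguous target sequence, which is one motivation for comparing the three notations.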
Anthology ID: 2020.icon-main.10
Volume: Proceedings of the 17th International Conference on Natural Language Processing (ICON)
Month: December
Year: 2020
Address: Indian Institute of Technology Patna, Patna, India
Editors: Pushpak Bhattacharyya, Dipti Misra Sharma, Rajeev Sangal
Venue: ICON
Publisher: NLP Association of India (NLPAI)
Pages: 76–84
URL: https://aclanthology.org/2020.icon-main.10
Cite (ACL): Kaden Griffith and Jugal Kalita. 2020. Solving Arithmetic Word Problems Using Transformer and Pre-processing of Problem Texts. In Proceedings of the 17th International Conference on Natural Language Processing (ICON), pages 76–84, Indian Institute of Technology Patna, Patna, India. NLP Association of India (NLPAI).
Cite (Informal): Solving Arithmetic Word Problems Using Transformer and Pre-processing of Problem Texts (Griffith & Kalita, ICON 2020)
PDF: https://aclanthology.org/2020.icon-main.10.pdf
Data: MAWPS