MathQA: Towards Interpretable Math Word Problem Solving with Operation-Based Formalisms
Aida Amini, Saadia Gabriel, Shanchuan Lin, Rik Koncel-Kedziorski, Yejin Choi, Hannaneh Hajishirzi
2019-06
Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers)
Editors: Jill Burstein, Christy Doran, Thamar Solorio
Association for Computational Linguistics
Minneapolis, Minnesota
conference publication
We introduce a large-scale dataset of math word problems and an interpretable neural math problem solver that learns to map problems to their operation programs. Due to annotation challenges, current datasets in this domain have been either relatively small in scale or have lacked precise operational annotations over diverse problem types. We introduce a new representation language to model the operation programs corresponding to each math problem, aiming to improve both the performance and the interpretability of the learned models. Using this representation language, we significantly enhance the AQUA-RAT dataset with fully specified operational programs. We additionally introduce a neural sequence-to-program model with automatic problem categorization. Our experiments show improvements over competitive baselines on our dataset as well as on the AQUA-RAT dataset. The results are still lower than human performance, indicating that the dataset poses new challenges for future research. Our dataset is available at https://math-qa.github.io/math-QA/
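To make the operation-program formalism mentioned in the abstract concrete, the sketch below evaluates a linear program of arithmetic operations against numbers extracted from a problem's text. The operation names (`add`, `multiply`, `divide`, ...) and argument conventions (`n0`, `n1` for problem numbers, `#0` for an earlier step's result) follow the style of the dataset's published examples; the evaluator itself is a minimal illustrative assumption, not the authors' solver.

```python
# Minimal sketch of evaluating a MathQA-style operation program.
# The operation vocabulary and argument conventions here are
# illustrative assumptions modeled on the dataset's examples.

OPS = {
    "add": lambda a, b: a + b,
    "subtract": lambda a, b: a - b,
    "multiply": lambda a, b: a * b,
    "divide": lambda a, b: a / b,
}

def evaluate(program: str, numbers: list) -> float:
    """Evaluate a program such as "multiply(n0,n1)|divide(#0,n2)"
    against the list of numbers extracted from the problem text."""
    results = []
    for step in program.split("|"):
        name, args = step.rstrip(")").split("(")
        values = []
        for arg in args.split(","):
            arg = arg.strip()
            if arg.startswith("n"):      # number from the problem text
                values.append(numbers[int(arg[1:])])
            elif arg.startswith("#"):    # result of an earlier step
                values.append(results[int(arg[1:])])
            else:                        # literal constant
                values.append(float(arg))
        results.append(OPS[name](*values))
    return results[-1]

# e.g. "A car travels 50 km/h for 2.5 hours; how far does it go?"
print(evaluate("multiply(n0,n1)", [50, 2.5]))  # → 125.0
```

Because each step names its operation and operands explicitly, the predicted program doubles as a human-readable explanation of the solver's reasoning, which is the interpretability benefit the abstract claims.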
amini-etal-2019-mathqa
DOI: 10.18653/v1/N19-1245
https://aclanthology.org/N19-1245
Pages 2357–2367