Solving Math Word Problems with Multi-Encoders and Multi-Decoders

Yibin Shen, Cheqing Jin


Abstract
Solving math word problems remains a challenging task in which latent semantics and mathematical logic must be mined from natural language. Although previous research employs the Seq2Seq technique to transform text descriptions into equation expressions, most models achieve inferior performance because the encoder and decoder are insufficiently designed. Specifically, these models treat input/output objects only as sequences, ignoring the important structural information contained in text descriptions and equation expressions. To overcome these defects, this paper proposes a model with multiple encoders and multiple decoders, which combines a sequence-based encoder with a graph-based encoder to enrich the representation of text descriptions, and generates different equation expressions via a sequence-based decoder and a tree-based decoder. Experimental results on the Math23K dataset show that the model outperforms existing state-of-the-art methods.
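The sketch below is a minimal, illustrative PyTorch rendering of the architecture the abstract describes: a sequence-based encoder (BiGRU) fused with a graph-based encoder (a single simplified graph-convolution step), feeding both a sequence-based decoder and a tree-based decoder. All class names, dimensions, the adjacency construction, and the heavily simplified tree decoder are assumptions for exposition, not the authors' implementation; see the official repository (yibinshen/multimath) for the real model.

```python
# Minimal sketch (not the authors' code) of a multi-encoder / multi-decoder MWP solver.
# Assumptions: toy dimensions, identity adjacency, and a one-step tree decoder.
import torch
import torch.nn as nn


class MultiEncoder(nn.Module):
    """Fuse a sequence view (BiGRU) with a graph view (one graph-conv layer)."""

    def __init__(self, vocab_size, hidden):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        self.gru = nn.GRU(hidden, hidden // 2, bidirectional=True, batch_first=True)
        self.gcn = nn.Linear(hidden, hidden)            # weight of a single graph-conv step
        self.fuse = nn.Linear(2 * hidden, hidden)

    def forward(self, tokens, adj):
        # tokens: (B, T) token ids; adj: (B, T, T) normalized adjacency over tokens
        seq, _ = self.gru(self.embed(tokens))           # sequence states: (B, T, H)
        graph = torch.relu(adj @ self.gcn(seq))         # graph states:    (B, T, H)
        return self.fuse(torch.cat([seq, graph], dim=-1))  # fused token states


class SeqDecoder(nn.Module):
    """Sequence-based decoder: emits the equation token by token."""

    def __init__(self, vocab_size, hidden):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        self.gru = nn.GRU(hidden, hidden, batch_first=True)
        self.out = nn.Linear(hidden, vocab_size)

    def forward(self, prev_tokens, init_state):
        h, _ = self.gru(self.embed(prev_tokens), init_state)
        return self.out(h)                              # (B, T_dec, vocab)


class TreeDecoder(nn.Module):
    """Tree-based decoder (simplified): score the current node and expand children top-down."""

    def __init__(self, vocab_size, hidden):
        super().__init__()
        self.score = nn.Linear(hidden, vocab_size)
        self.left = nn.Linear(hidden, hidden)
        self.right = nn.Linear(hidden, hidden)

    def forward(self, node_state):
        logits = self.score(node_state)                 # operator/operand distribution
        return logits, torch.tanh(self.left(node_state)), torch.tanh(self.right(node_state))


if __name__ == "__main__":
    B, T, H, V = 2, 8, 64, 30
    tokens = torch.randint(0, V, (B, T))
    adj = torch.eye(T).expand(B, T, T)                  # placeholder dependency graph
    states = MultiEncoder(V, H)(tokens, adj)            # (B, T, H)
    problem = states.mean(dim=1)                        # pooled problem representation

    seq_logits = SeqDecoder(V, H)(tokens[:, :4], problem.unsqueeze(0))
    root_logits, left_state, right_state = TreeDecoder(V, H)(problem)
    print(seq_logits.shape, root_logits.shape)
```

In the paper's setting, the two decoders produce different equation forms from the same fused encoding, and the graph encoder is what injects the structural information (e.g., dependencies among quantities) that a plain sequence encoder misses; the identity adjacency above merely stands in for that structure.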
Anthology ID:
2020.coling-main.262
Volume:
Proceedings of the 28th International Conference on Computational Linguistics
Month:
December
Year:
2020
Address:
Barcelona, Spain (Online)
Editors:
Donia Scott, Nuria Bel, Chengqing Zong
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
2924–2934
URL:
https://aclanthology.org/2020.coling-main.262
DOI:
10.18653/v1/2020.coling-main.262
Cite (ACL):
Yibin Shen and Cheqing Jin. 2020. Solving Math Word Problems with Multi-Encoders and Multi-Decoders. In Proceedings of the 28th International Conference on Computational Linguistics, pages 2924–2934, Barcelona, Spain (Online). International Committee on Computational Linguistics.
Cite (Informal):
Solving Math Word Problems with Multi-Encoders and Multi-Decoders (Shen & Jin, COLING 2020)
PDF:
https://aclanthology.org/2020.coling-main.262.pdf
Code
yibinshen/multimath
Data
Math23K