EPT-X: An Expression-Pointer Transformer model that generates eXplanations for numbers

Bugeun Kim, Kyung Seo Ki, Sangkyu Rhim, Gahgene Gweon


Abstract
In this paper, we propose a neural model EPT-X (Expression-Pointer Transformer with Explanations), which utilizes natural language explanations to solve an algebraic word problem. To enhance the explainability of the encoding process of a neural model, EPT-X adopts the concepts of plausibility and faithfulness, which are drawn from math word problem solving strategies used by humans. A plausible explanation is one that includes contextual information for the numbers and variables that appear in a given math word problem. A faithful explanation is one that accurately represents the reasoning process behind the model’s solution equation. The EPT-X model yields an average baseline performance of 69.59% on our PEN dataset and produces explanations whose quality is comparable to human output. The contribution of this work is two-fold. (1) EPT-X model: An explainable neural model that sets a baseline for the algebraic word problem solving task, in terms of the model’s correctness, plausibility, and faithfulness. (2) New dataset: We release a novel dataset PEN (Problems with Explanations for Numbers), which expands the existing datasets by attaching explanations to each number/variable.
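To make the idea of per-number explanations concrete, the following is a minimal, hypothetical sketch in Python of how a word problem's numbers and variables might be paired with natural-language explanations and a solution equation, in the spirit the abstract describes. The field names, example problem, and explanation strings are illustrative assumptions and do not reflect the actual schema of the released PEN dataset.

# Hypothetical illustration of a PEN-style annotated problem.
# Field names and the example below are invented for illustration;
# they are NOT the released PEN data format.

from dataclasses import dataclass, field
from typing import List

@dataclass
class NumberExplanation:
    value: str        # the number or variable token as it appears in the problem
    explanation: str  # natural-language context for that token ("plausible" explanation)

@dataclass
class AnnotatedProblem:
    text: str
    numbers: List[NumberExplanation] = field(default_factory=list)
    equation: str = ""  # solution equation the explanations should faithfully support

# A toy algebraic word problem annotated with per-number explanations.
problem = AnnotatedProblem(
    text="Tom has 3 apples and buys 5 more. How many apples does he have now?",
    numbers=[
        NumberExplanation("3", "the number of apples Tom starts with"),
        NumberExplanation("5", "the number of apples Tom buys"),
        NumberExplanation("x", "the total number of apples Tom has now"),
    ],
    equation="x = 3 + 5",
)

for n in problem.numbers:
    print(f"{n.value}: {n.explanation}")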
Anthology ID:
2022.acl-long.305
Volume:
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
4442–4458
URL:
https://aclanthology.org/2022.acl-long.305
DOI:
10.18653/v1/2022.acl-long.305
Cite (ACL):
Bugeun Kim, Kyung Seo Ki, Sangkyu Rhim, and Gahgene Gweon. 2022. EPT-X: An Expression-Pointer Transformer model that generates eXplanations for numbers. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 4442–4458, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
EPT-X: An Expression-Pointer Transformer model that generates eXplanations for numbers (Kim et al., ACL 2022)
PDF:
https://aclanthology.org/2022.acl-long.305.pdf
Software:
 2022.acl-long.305.software.tgz
Video:
 https://aclanthology.org/2022.acl-long.305.mp4
Code:
snucclab/ept-x
Data:
PEN, ALG514, DRAW-1K, MAWPS