Qingpeng Nong
2023
An Expression Tree Decoding Strategy for Mathematical Equation Generation
Wenqi Zhang, Yongliang Shen, Qingpeng Nong, Zeqi Tan, Yanna Ma, Weiming Lu
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Generating mathematical equations from natural language requires an accurate understanding of the relations among math expressions. Existing approaches can be broadly categorized into token-level and expression-level generation. The former treats equations as a mathematical language, sequentially generating math tokens. Expression-level methods generate each expression one by one. However, each expression represents a solving step, and there naturally exist parallel or dependent relations between these steps, which are ignored by current sequential methods. Therefore, we integrate a tree structure into expression-level generation and advocate an expression tree decoding strategy. To generate a tree with expressions as its nodes, we employ a layer-wise parallel decoding strategy: we decode multiple independent expressions (leaf nodes) in parallel at each layer and repeat parallel decoding layer by layer to sequentially generate those parent-node expressions that depend on others. In addition, a bipartite matching algorithm is adopted to align the multiple predictions with the annotations at each layer. Experiments show our method outperforms other baselines, especially on equations with complex structures.
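The abstract mentions aligning each layer's unordered predictions with the gold annotations via bipartite matching. A minimal illustrative sketch of that alignment step (not the paper's code; the brute-force matcher, the toy 0/1 cost, and all names here are assumptions standing in for the Hungarian algorithm typically used in set-prediction losses):

```python
from itertools import permutations

def match_layer(pred_exprs, gold_exprs, cost):
    """Brute-force minimum-cost bipartite matching between predicted and
    gold expressions at one decoding layer. Returns (perm, total_cost),
    where pred_exprs[i] is matched to gold_exprs[perm[i]]."""
    n = len(gold_exprs)
    best, best_cost = None, float("inf")
    for perm in permutations(range(n)):
        c = sum(cost(pred_exprs[i], gold_exprs[j]) for i, j in enumerate(perm))
        if c < best_cost:
            best, best_cost = perm, c
    return best, best_cost

# Toy cost: 0 if the two expressions are string-equal, else 1.
preds = ["a+b", "c*d"]
golds = ["c*d", "a+b"]
assignment, total = match_layer(preds, golds, lambda p, g: 0 if p == g else 1)
# assignment == (1, 0): each prediction is paired with its gold
# expression regardless of the order in which either was produced.
```

In practice such costs would come from model likelihoods rather than string equality, and an O(n³) Hungarian solver would replace the factorial enumeration; the point is only that matching makes the loss order-invariant across parallel predictions.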
2022
Multi-View Reasoning: Consistent Contrastive Learning for Math Word Problem
Wenqi Zhang, Yongliang Shen, Yanna Ma, Xiaoxia Cheng, Zeqi Tan, Qingpeng Nong, Weiming Lu
Findings of the Association for Computational Linguistics: EMNLP 2022
A math word problem solver requires both precise relational reasoning about the quantities in the text and reliable generation of diverse equations. Current sequence-to-tree or relation-extraction methods approach this from only a fixed view, struggling to simultaneously handle complex semantics and diverse equations. However, human solving naturally involves two consistent reasoning views, top-down and bottom-up, just as math equations can also be expressed in multiple equivalent forms: pre-order and post-order. We propose multi-view consistent contrastive learning for a more complete semantics-to-equation mapping. The entire process is decoupled into two independent but consistent views, top-down decomposition and bottom-up construction, and the two reasoning views are aligned at multiple granularities for consistency, enhancing global generation and precise reasoning. Experiments on multiple datasets across two languages show our approach significantly outperforms existing baselines, especially on complex problems. We also show that after consistent alignment, the multi-view model can absorb the merits of both views and generate more diverse results consistent with mathematical laws.
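The two equivalent equation forms this abstract refers to, pre-order (matching top-down decomposition) and post-order (matching bottom-up construction), are standard tree traversals. A small illustrative sketch, assuming a toy binary expression tree (the `Node` class and traversal helpers are hypothetical, not from the paper):

```python
class Node:
    """A node in a binary expression tree: operators are internal
    nodes, quantities are leaves."""
    def __init__(self, val, left=None, right=None):
        self.val, self.left, self.right = val, left, right

def preorder(n):
    # Root first: the top-down view decomposes an equation into sub-goals.
    return [] if n is None else [n.val] + preorder(n.left) + preorder(n.right)

def postorder(n):
    # Children first: the bottom-up view composes quantities into results.
    return [] if n is None else postorder(n.left) + postorder(n.right) + [n.val]

# (a + b) * c as a tree
tree = Node("*", Node("+", Node("a"), Node("b")), Node("c"))
# pre-order:  ['*', '+', 'a', 'b', 'c']
# post-order: ['a', 'b', '+', 'c', '*']
```

Both serializations encode the same equation, which is what lets the two reasoning views be contrasted and aligned for consistency.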
Co-authors
 Wenqi Zhang 2
 Yongliang Shen 2
 Yanna Ma 2
 Zeqi Tan 2
 Weiming Lu 2