Improving compositional generalization for multi-step quantitative reasoning in question answering

Armineh Nourbakhsh, Cathy Jiao, Sameena Shah, Carolyn Rosé


Abstract
Quantitative reasoning is an important aspect of question answering, especially when numeric and verbal cues interact to indicate sophisticated, multi-step programs. In this paper, we demonstrate how modeling the compositional nature of quantitative text can enhance the performance and robustness of QA models, allowing them to capture arithmetic logic that is expressed verbally. Borrowing from the literature on semantic parsing, we propose a method that encourages the QA models to adjust their attention patterns and capture input/output alignments that are meaningful to the reasoning task. We show how this strategy improves program accuracy and renders the models more robust against overfitting as the number of reasoning steps grows. Our approach is designed as a standalone module which can be prepended to many existing models and trained in an end-to-end fashion without the need for additional supervisory signal. As part of this exercise, we also create a unified dataset building on four previously released numerical QA datasets over tabular data.
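The abstract does not spell out the module itself, so the following is only a minimal sketch of the general idea it describes: guiding a QA model's decoder cross-attention toward input/output alignments that matter for the arithmetic program, using alignments derived heuristically from the data rather than extra annotation. The class names, the token-identity alignment heuristic, and the loss form below are assumptions for illustration, not the paper's actual formulation.

```python
import torch
import torch.nn as nn


def numeric_match_mask(src_ids, tgt_ids, numeric_token_ids):
    """Build a (tgt_len, src_len) 0/1 mask linking each numeric token in the
    target program to occurrences of the same token in the input.
    Hypothetical heuristic: alignments come from token identity alone,
    so no additional supervisory signal is required."""
    tgt_len, src_len = tgt_ids.size(0), src_ids.size(0)
    mask = torch.zeros(tgt_len, src_len)
    for t in range(tgt_len):
        if tgt_ids[t].item() in numeric_token_ids:
            mask[t] = (src_ids == tgt_ids[t]).float()
    return mask


class AttentionAlignmentLoss(nn.Module):
    """Auxiliary objective that nudges decoder cross-attention toward the
    heuristic alignments above; an illustrative sketch only."""

    def __init__(self, weight=0.1, eps=1e-8):
        super().__init__()
        self.weight = weight
        self.eps = eps

    def forward(self, cross_attn, align_mask):
        # cross_attn: (tgt_len, src_len) attention probabilities, rows sum to 1
        # align_mask: (tgt_len, src_len) heuristic alignment mask
        aligned_rows = align_mask.sum(dim=-1) > 0
        if not aligned_rows.any():
            return cross_attn.new_zeros(())
        # Penalize attention mass that falls outside the aligned source
        # positions, only for target positions that have an alignment.
        mass = (cross_attn * align_mask).sum(dim=-1)
        loss = -(mass[aligned_rows] + self.eps).log().mean()
        return self.weight * loss
```

In a setup like this, the auxiliary term would simply be added to the usual generation loss during end-to-end training; the hyperparameter `weight` controlling its strength is likewise an assumed detail.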
Anthology ID:
2022.emnlp-main.125
Volume:
Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates
Editors:
Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
1916–1932
URL:
https://aclanthology.org/2022.emnlp-main.125
DOI:
10.18653/v1/2022.emnlp-main.125
Cite (ACL):
Armineh Nourbakhsh, Cathy Jiao, Sameena Shah, and Carolyn Rosé. 2022. Improving compositional generalization for multi-step quantitative reasoning in question answering. In Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, pages 1916–1932, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
Improving compositional generalization for multi-step quantitative reasoning in question answering (Nourbakhsh et al., EMNLP 2022)
PDF:
https://aclanthology.org/2022.emnlp-main.125.pdf