Exploiting Numerical-Contextual Knowledge to Improve Numerical Reasoning in Question Answering

Jeonghwan Kim, Junmo Kang, Kyung-min Kim, Giwon Hong, Sung-Hyon Myaeng


Abstract
Numerical reasoning over text is a challenging subtask in question answering (QA) that requires understanding both text and numbers. However, the language models underlying existing numerical reasoning QA models tend to rely excessively on their pre-existing parametric knowledge at inference time, which commonly causes hallucination when interpreting numbers. Our work proposes a novel attention-masked reasoning model, NC-BERT, that learns to leverage number-related contextual knowledge to alleviate this over-reliance on parametric knowledge and enhance the numerical reasoning capabilities of the QA model. The empirical results suggest that understanding numbers in their context, by reducing the influence of parametric knowledge and refining the numerical information in the number embeddings, leads to improved numerical reasoning accuracy on DROP, a numerical QA dataset.
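The core mechanism the abstract describes, restricting a number token's attention to its surrounding context so the model relies less on parametric knowledge, can be illustrated with a toy sketch. This is not the paper's implementation: the mask pattern, the context window, and all names below are hypothetical, chosen only to show how an attention mask steers a number token toward contextual tokens.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def masked_attention(Q, K, V, mask):
    """Scaled dot-product attention with a boolean mask.

    mask[i, j] == True means query token i may attend to key token j.
    Disallowed positions receive a large negative score, so their
    softmax weight is effectively zero.
    """
    scores = Q @ K.T / np.sqrt(Q.shape[-1])
    scores = np.where(mask, scores, -1e9)
    weights = softmax(scores, axis=-1)
    return weights @ V, weights

# Toy example: 5 tokens; token 2 stands in for a number token.
# We (hypothetically) restrict it to attend only to its local
# context window, tokens 1..3, instead of the whole sequence.
rng = np.random.default_rng(0)
T, d = 5, 8
Q, K, V = rng.normal(size=(3, T, d))

mask = np.ones((T, T), dtype=bool)   # all tokens attend everywhere...
mask[2, :] = False
mask[2, 1:4] = True                  # ...except the number token

out, weights = masked_attention(Q, K, V, mask)
```

After the mask is applied, the number token's attention weights outside its context window are (numerically) zero, while the row still sums to one; the rest of the tokens are unaffected.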
Anthology ID:
2022.findings-naacl.138
Volume:
Findings of the Association for Computational Linguistics: NAACL 2022
Month:
July
Year:
2022
Address:
Seattle, United States
Editors:
Marine Carpuat, Marie-Catherine de Marneffe, Ivan Vladimir Meza Ruiz
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
1811–1821
URL:
https://aclanthology.org/2022.findings-naacl.138
DOI:
10.18653/v1/2022.findings-naacl.138
Bibkey:
Cite (ACL):
Jeonghwan Kim, Junmo Kang, Kyung-min Kim, Giwon Hong, and Sung-Hyon Myaeng. 2022. Exploiting Numerical-Contextual Knowledge to Improve Numerical Reasoning in Question Answering. In Findings of the Association for Computational Linguistics: NAACL 2022, pages 1811–1821, Seattle, United States. Association for Computational Linguistics.
Cite (Informal):
Exploiting Numerical-Contextual Knowledge to Improve Numerical Reasoning in Question Answering (Kim et al., Findings 2022)
PDF:
https://aclanthology.org/2022.findings-naacl.138.pdf
Software:
 2022.findings-naacl.138.software.zip
Video:
 https://aclanthology.org/2022.findings-naacl.138.mp4
Data:
DROP