A Neural-Symbolic Approach to Natural Language Understanding

Zhixuan Liu, Zihao Wang, Yuan Lin, Hang Li


Abstract
Deep neural networks, empowered by pre-trained language models, have achieved remarkable results in natural language understanding (NLU) tasks. However, their performance can deteriorate drastically when logical reasoning is required. This is because NLU in principle depends not only on analogical reasoning, at which deep neural networks excel, but also on logical reasoning. According to the dual-process theory, analogical reasoning and logical reasoning are carried out by System 1 and System 2 in the human brain, respectively. Inspired by this theory, we present a novel framework for NLU called Neural-Symbolic Processor (NSP), which performs analogical reasoning based on neural processing and logical reasoning based on both neural and symbolic processing. As a case study, we conduct experiments on two NLU tasks, question answering (QA) and natural language inference (NLI), in settings where numerical reasoning (a type of logical reasoning) is necessary. The experimental results show that our method significantly outperforms state-of-the-art methods on both tasks.
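The abstract's division of labor — a neural System 1 that interprets language and a symbolic System 2 that reasons exactly — can be illustrated with a minimal sketch. This is an assumption-laden toy, not the authors' NSP implementation: it supposes the neural component emits a small arithmetic program over numbers mentioned in the question, which a symbolic executor then evaluates deterministically.

```python
import ast
import operator

# Illustrative sketch only: the pipeline below is an assumption about how a
# neural-symbolic system for numerical reasoning might be structured, not the
# paper's actual NSP method. Idea: a neural model (System 1) maps a question
# to a symbolic program (an arithmetic expression); a symbolic executor
# (System 2) evaluates that program exactly, where pure neural prediction of
# the numeric answer would be unreliable.

OPS = {
    ast.Add: operator.add,
    ast.Sub: operator.sub,
    ast.Mult: operator.mul,
    ast.Div: operator.truediv,
}

def execute(program: str) -> float:
    """Symbolic executor: safely evaluate an arithmetic expression string."""
    def ev(node):
        if isinstance(node, ast.BinOp) and type(node.op) in OPS:
            return OPS[type(node.op)](ev(node.left), ev(node.right))
        if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
            return node.value
        raise ValueError("unsupported expression")
    return ev(ast.parse(program, mode="eval").body)

def answer(question: str, predicted_program: str) -> float:
    # In a real system, `predicted_program` would be generated by a neural
    # model conditioned on the question; here it is supplied directly.
    return execute(predicted_program)

print(answer("The team scored 24 points, then 17 more. How many in total?",
             "24 + 17"))  # 41
```

The design point the sketch makes is that once reasoning is delegated to a symbolic executor, arithmetic correctness no longer depends on the neural model memorizing number facts — the model only needs to produce the right program.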
Anthology ID:
2022.findings-emnlp.158
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2022
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates
Editors:
Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
2159–2172
URL:
https://aclanthology.org/2022.findings-emnlp.158
DOI:
10.18653/v1/2022.findings-emnlp.158
Cite (ACL):
Zhixuan Liu, Zihao Wang, Yuan Lin, and Hang Li. 2022. A Neural-Symbolic Approach to Natural Language Understanding. In Findings of the Association for Computational Linguistics: EMNLP 2022, pages 2159–2172, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
A Neural-Symbolic Approach to Natural Language Understanding (Liu et al., Findings 2022)
PDF:
https://aclanthology.org/2022.findings-emnlp.158.pdf