Empower Nested Boolean Logic via Self-Supervised Curriculum Learning

Hongqiu Wu, Linfeng Liu, Hai Zhao, Min Zhang


Abstract
Beyond the great cognitive powers showcased by language models, it is crucial to scrutinize whether their reasoning capabilities stem from strong generalization or merely from exposure to relevant data. As opposed to constructing increasingly complex logic, this paper probes boolean logic, the root capability of a logical reasoner. We find that any pre-trained language model, including large language models, behaves only like a random selector when faced with multi-nested boolean logic, a task that humans can handle with ease. To empower language models with this fundamental capability, this paper proposes a new self-supervised learning method, Curriculum Logical Reasoning (Clr), in which we augment the training data with nested boolean logic chains step by step and schedule training to progress from simpler logical patterns to harder ones. This new training paradigm allows language models to effectively generalize to much harder and longer-hop logic, which can hardly be learned through naive training. Furthermore, we show that boolean logic is a great foundation for improving performance on subsequent general logical tasks.
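To make the curriculum idea concrete, below is a minimal, hypothetical Python sketch of the training paradigm the abstract describes: synthetic boolean statements are generated at increasing nesting depth, and examples are presented from shallow to deep. The function names and the toy string format are illustrative assumptions, not the authors' released data-construction code.

```python
import random

def make_nested_boolean_example(depth: int) -> tuple[str, bool]:
    """Build a statement with `depth` nested boolean operators and its truth value.

    Hypothetical helper for illustration; the paper constructs nested boolean
    logic over natural-language statements rather than raw truth literals.
    """
    value = random.choice([True, False])
    text = "true" if value else "false"
    for _ in range(depth):
        op = random.choice(["not", "and", "or"])
        if op == "not":
            value = not value
            text = f"not ({text})"
        else:
            other = random.choice([True, False])
            value = (value and other) if op == "and" else (value or other)
            text = f"({text}) {op} {str(other).lower()}"
    return text, value

def curriculum(max_depth: int, examples_per_stage: int):
    """Yield training examples stage by stage, from shallow to deep nesting."""
    for depth in range(1, max_depth + 1):
        for _ in range(examples_per_stage):
            yield make_nested_boolean_example(depth)

# Example: train on 1-nested logic first, then progressively deeper chains.
for statement, label in curriculum(max_depth=3, examples_per_stage=2):
    print(f"{statement}  ->  {label}")
```

The key design choice this sketch mirrors is the staged schedule: the model sees only shallow nesting early on and harder, longer-hop chains later, rather than a uniform mix.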
Anthology ID:
2023.emnlp-main.847
Volume:
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
13731–13742
URL:
https://aclanthology.org/2023.emnlp-main.847
DOI:
10.18653/v1/2023.emnlp-main.847
Cite (ACL):
Hongqiu Wu, Linfeng Liu, Hai Zhao, and Min Zhang. 2023. Empower Nested Boolean Logic via Self-Supervised Curriculum Learning. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 13731–13742, Singapore. Association for Computational Linguistics.
Cite (Informal):
Empower Nested Boolean Logic via Self-Supervised Curriculum Learning (Wu et al., EMNLP 2023)
PDF:
https://aclanthology.org/2023.emnlp-main.847.pdf
Video:
https://aclanthology.org/2023.emnlp-main.847.mp4