Investigating Transformer-Guided Chaining for Interpretable Natural Logic Reasoning

Kanagasabai Rajaraman, Saravanan Rajamanickam, Wei Shi


Abstract
Natural logic reasoning has received increasing attention lately, with several datasets and neural models proposed, though with limited success. More recently, a new class of works has emerged that adopts a neuro-symbolic approach called transformer-guided chaining, in which the idea is to iteratively perform 1-step neural inferences and chain the results together to generate a multi-step reasoning trace. Several works have adopted variants of this central idea and reported significantly higher accuracies than vanilla LLMs. In this paper, we perform a critical empirical investigation of the chaining approach on a multi-hop First-Order Logic (FOL) reasoning benchmark. In particular, we develop a reference implementation, called Chainformer, and conduct several experiments to analyze its accuracy, generalization, interpretability, and performance on FOL reasoning. Our findings highlight key strengths and current limitations, and suggest potential areas for future research in logic reasoning.
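The chaining loop the abstract describes (propose a 1-step inference, add its conclusion to the known facts, repeat until the goal is derived) can be sketched as below. This is a minimal illustration, not the authors' Chainformer implementation; the `one_step_infer` callable and all names here are hypothetical stand-ins for a fine-tuned transformer that proposes a single conclusion from the current facts.

```python
from typing import Callable, List, Optional, Tuple

# A step pairs a proposed conclusion with the premises that support it.
Step = Tuple[str, List[str]]

def chain_reasoning(
    facts: List[str],
    goal: str,
    one_step_infer: Callable[[List[str]], Optional[Step]],  # hypothetical 1-step neural model
    max_steps: int = 10,
) -> Optional[List[Step]]:
    """Iteratively chain 1-step inferences into a multi-step reasoning trace.

    Returns the list of (conclusion, premises) steps if the goal is derived,
    or None if inference stalls or the step budget is exhausted.
    """
    known = list(facts)
    trace: List[Step] = []
    for _ in range(max_steps):
        step = one_step_infer(known)          # neural model proposes one new conclusion
        if step is None:                      # model finds nothing new to infer
            return None
        conclusion, premises = step
        trace.append((conclusion, premises))  # the trace is what makes the answer interpretable
        if conclusion == goal:
            return trace                      # goal reached: return the full chain
        known.append(conclusion)              # chain: feed the conclusion back as a fact
    return None
```

Interpretability falls out of the loop structure: every answer comes with an explicit chain of intermediate conclusions and their supporting premises, rather than a single opaque forward pass.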
Anthology ID:
2023.findings-acl.588
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
9240–9253
URL:
https://aclanthology.org/2023.findings-acl.588
DOI:
10.18653/v1/2023.findings-acl.588
Cite (ACL):
Kanagasabai Rajaraman, Saravanan Rajamanickam, and Wei Shi. 2023. Investigating Transformer-Guided Chaining for Interpretable Natural Logic Reasoning. In Findings of the Association for Computational Linguistics: ACL 2023, pages 9240–9253, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Investigating Transformer-Guided Chaining for Interpretable Natural Logic Reasoning (Rajaraman et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-acl.588.pdf
Video:
https://aclanthology.org/2023.findings-acl.588.mp4