Natural Language Deduction with Incomplete Information

Zayne Sprague, Kaj Bostrom, Swarat Chaudhuri, Greg Durrett


Abstract
A growing body of work studies how to answer a question or verify a claim by generating a natural language "proof": a chain of deductive inferences yielding the answer based on a set of premises. However, these methods can only make sound deductions from evidence that is explicitly given. We propose a new system that can handle the underspecified setting where not all premises are stated at the outset; that is, additional assumptions need to be materialized to prove a claim. By using a natural language generation model to abductively infer a premise given another premise and a conclusion, we can impute missing pieces of evidence needed for the conclusion to be true. Our system searches over two fringes in a bidirectional fashion, interleaving deductive (forward-chaining) and abductive (backward-chaining) generation steps. We sample multiple possible outputs for each step to achieve coverage of the search space, at the same time ensuring correctness by filtering low-quality generations with a round-trip validation procedure. Results on a modified version of the EntailmentBank dataset and a new dataset called Everyday Norms: Why Not? show that abductive generation with validation can recover premises across in- and out-of-domain settings.
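The abstract's bidirectional search, abductive premise imputation, and round-trip validation can be sketched as follows. This is a minimal illustration, not the paper's implementation: the `deduce` and `abduce` functions here are toy string-based stand-ins for the paper's generation models, and `prove` is a hypothetical single-round search.

```python
def deduce(p1, p2):
    # Toy stand-in for a deductive (forward-chaining) generation model:
    # combines two premises into a conclusion.
    return f"({p1} & {p2})"

def abduce(goal, premise):
    # Toy stand-in for an abductive (backward-chaining) generation model:
    # given a goal "(premise & X)" and one premise, recover the missing X.
    prefix = f"({premise} & "
    if goal.startswith(prefix) and goal.endswith(")"):
        return goal[len(prefix):-1]
    return None

def round_trip_valid(premise, abduced, goal):
    # Round-trip validation: the abduced premise is kept only if
    # deducing from it re-derives the original goal.
    return abduced is not None and deduce(premise, abduced) == goal

def prove(premises, goal):
    """Interleave a forward (deductive) pass with a backward (abductive)
    pass; returns the list of imputed premises, or None on failure."""
    forward = set(premises)
    # Forward fringe: one round of closing the premise set under deduction.
    for a in list(forward):
        for b in list(forward):
            if a != b:
                forward.add(deduce(a, b))
    if goal in forward:
        return []  # goal follows from the stated premises alone
    # Backward fringe: abductively impute a missing premise, then validate.
    for p in premises:
        candidate = abduce(goal, p)
        if round_trip_valid(p, candidate, goal):
            return [candidate]
    return None
```

For example, given only the premise `"rain"` and the goal `"(rain & wet)"`, the abductive step imputes the missing premise `"wet"`, and round-trip validation confirms that deduction from the pair reproduces the goal.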
Anthology ID:
2022.emnlp-main.564
Volume:
Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates
Editors:
Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
8230–8258
URL:
https://aclanthology.org/2022.emnlp-main.564
DOI:
10.18653/v1/2022.emnlp-main.564
Bibkey:
Cite (ACL):
Zayne Sprague, Kaj Bostrom, Swarat Chaudhuri, and Greg Durrett. 2022. Natural Language Deduction with Incomplete Information. In Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, pages 8230–8258, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
Natural Language Deduction with Incomplete Information (Sprague et al., EMNLP 2022)
PDF:
https://aclanthology.org/2022.emnlp-main.564.pdf