Bridging Knowledge Gaps in Neural Entailment via Symbolic Models

Dongyeop Kang, Tushar Khot, Ashish Sabharwal, Peter Clark


Abstract
Most textual entailment models focus on lexical gaps between the premise text and the hypothesis, but rarely on knowledge gaps. We focus on filling these knowledge gaps in the Science Entailment task, by leveraging an external structured knowledge base (KB) of science facts. Our new architecture combines standard neural entailment models with a knowledge lookup module. To facilitate this lookup, we propose a fact-level decomposition of the hypothesis, and verifying the resulting sub-facts against both the textual premise and the structured KB. Our model, NSNet, learns to aggregate predictions from these heterogeneous data formats. On the SciTail dataset, NSNet outperforms a simpler combination of the two predictions by 3% and the base entailment model by 5%.
Anthology ID:
D18-1535
Volume:
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
Month:
October-November
Year:
2018
Address:
Brussels, Belgium
Editors:
Ellen Riloff, David Chiang, Julia Hockenmaier, Jun’ichi Tsujii
Venue:
EMNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
4940–4945
URL:
https://aclanthology.org/D18-1535
DOI:
10.18653/v1/D18-1535
Cite (ACL):
Dongyeop Kang, Tushar Khot, Ashish Sabharwal, and Peter Clark. 2018. Bridging Knowledge Gaps in Neural Entailment via Symbolic Models. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, pages 4940–4945, Brussels, Belgium. Association for Computational Linguistics.
Cite (Informal):
Bridging Knowledge Gaps in Neural Entailment via Symbolic Models (Kang et al., EMNLP 2018)
PDF:
https://aclanthology.org/D18-1535.pdf
Attachment:
D18-1535.Attachment.pdf