Embedding WordNet Knowledge for Textual Entailment

Yunshi Lan, Jing Jiang


Abstract
In this paper, we study how to improve a deep learning approach to textual entailment by incorporating lexical entailment relations from WordNet. Our idea is to embed the lexical entailment knowledge contained in WordNet in specially-learned word vectors, which we call “entailment vectors.” We present a standard neural network model and a novel set-theoretic model to learn these entailment vectors from word pairs with known lexical entailment relations derived from WordNet. We further incorporate these entailment vectors into a decomposable attention model for textual entailment and evaluate the model on the SICK and SNLI datasets. We find that with these entailment vectors we can significantly improve performance on textual entailment compared with a baseline that uses only standard word2vec vectors. The final performance of our model is close to or above the state of the art, yet our method does not rely on any manually-crafted rules or extensive syntactic features.
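The entailment vectors described above are trained on word pairs with known lexical entailment relations drawn from WordNet. As a rough illustration of that extraction step (a minimal sketch, not the authors' code; the function name and the restriction to direct noun hypernymy are assumptions here), the following Python snippet collects hypernym word pairs with NLTK:

    # Minimal sketch: derive lexical entailment (hyponym -> hypernym) word
    # pairs from WordNet via NLTK. Assumes nltk is installed and the
    # WordNet data has been fetched with nltk.download("wordnet").
    from nltk.corpus import wordnet as wn

    def lexical_entailment_pairs(pos="n"):
        """Yield (word, entailed_word) pairs where the first word lexically
        entails the second through a direct hypernym link, e.g. a word for
        a specific animal entailing the word for its broader category."""
        for synset in wn.all_synsets(pos=pos):
            for hypernym in synset.hypernyms():
                for word in synset.lemma_names():
                    for entailed in hypernym.lemma_names():
                        if word != entailed:
                            yield (word.lower(), entailed.lower())

    pairs = set(lexical_entailment_pairs())
    print(len(pairs))                   # number of distinct entailing noun pairs
    print(("cat", "feline") in pairs)   # direct hypernymy holds for this pair

How such pairs are weighted, negatively sampled, and fed into the two training models is specific to the paper; this sketch only shows where the supervision comes from.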
Anthology ID:
C18-1023
Volume:
Proceedings of the 27th International Conference on Computational Linguistics
Month:
August
Year:
2018
Address:
Santa Fe, New Mexico, USA
Editors:
Emily M. Bender, Leon Derczynski, Pierre Isabelle
Venue:
COLING
Publisher:
Association for Computational Linguistics
Pages:
270–281
URL:
https://aclanthology.org/C18-1023
Cite (ACL):
Yunshi Lan and Jing Jiang. 2018. Embedding WordNet Knowledge for Textual Entailment. In Proceedings of the 27th International Conference on Computational Linguistics, pages 270–281, Santa Fe, New Mexico, USA. Association for Computational Linguistics.
Cite (Informal):
Embedding WordNet Knowledge for Textual Entailment (Lan & Jiang, COLING 2018)
PDF:
https://aclanthology.org/C18-1023.pdf
Data:
SICK, SNLI