Hy-NLI: a Hybrid system for Natural Language Inference

Aikaterini-Lida Kalouli, Richard Crouch, Valeria de Paiva


Abstract
Despite the advances in Natural Language Inference through the training of massive deep models, recent work has revealed the generalization difficulties of such models, which fail to perform well on adversarial datasets with challenging linguistic phenomena. Such phenomena, however, can be handled well by symbolic systems. Thus, we propose Hy-NLI, a hybrid system that learns to identify an NLI pair as linguistically challenging or not. Based on that, it uses its symbolic or deep learning component, respectively, to make the final inference decision. We show how linguistically less complex cases are best solved by robust state-of-the-art models, like BERT and XLNet, while hard linguistic phenomena are best handled by our implemented symbolic engine. Our thorough evaluation shows that our hybrid system achieves state-of-the-art performance across mainstream and adversarial datasets and opens the way for further research in the hybrid direction.
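
The abstract sketches a routing architecture: a classifier decides whether a premise-hypothesis pair is linguistically challenging and dispatches it to either the symbolic engine or the neural model. The Python sketch below is a purely illustrative rendering of that idea under assumptions of ours; all names, the toy heuristic, and the dummy components are hypothetical and do not reflect the authors' implementation (their code is in kkalouli/gkr4nli).

```python
from dataclasses import dataclass


@dataclass
class NLIPair:
    premise: str
    hypothesis: str


def is_linguistically_challenging(pair: NLIPair) -> bool:
    """Stand-in for the learned routing classifier: here a crude keyword
    heuristic flags pairs that may involve hard phenomena such as negation,
    quantification, or modality (assumption, not the paper's method)."""
    hard_markers = {"not", "no", "every", "some", "might", "if"}
    tokens = (pair.premise + " " + pair.hypothesis).lower().split()
    return any(tok in hard_markers for tok in tokens)


def symbolic_inference(pair: NLIPair) -> str:
    """Placeholder for the symbolic component (e.g. a GKR-based engine)."""
    return "contradiction"  # dummy decision


def neural_inference(pair: NLIPair) -> str:
    """Placeholder for a deep model such as BERT or XLNet."""
    return "entailment"  # dummy decision


def hybrid_nli(pair: NLIPair) -> str:
    # Hard cases go to the symbolic engine, the rest to the neural model.
    if is_linguistically_challenging(pair):
        return symbolic_inference(pair)
    return neural_inference(pair)


if __name__ == "__main__":
    pair = NLIPair("The dog did not bark.", "The dog barked.")
    print(hybrid_nli(pair))
```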
Anthology ID:
2020.coling-main.459
Volume:
Proceedings of the 28th International Conference on Computational Linguistics
Month:
December
Year:
2020
Address:
Barcelona, Spain (Online)
Editors:
Donia Scott, Nuria Bel, Chengqing Zong
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
5235–5249
URL:
https://aclanthology.org/2020.coling-main.459
DOI:
10.18653/v1/2020.coling-main.459
Cite (ACL):
Aikaterini-Lida Kalouli, Richard Crouch, and Valeria de Paiva. 2020. Hy-NLI: a Hybrid system for Natural Language Inference. In Proceedings of the 28th International Conference on Computational Linguistics, pages 5235–5249, Barcelona, Spain (Online). International Committee on Computational Linguistics.
Cite (Informal):
Hy-NLI: a Hybrid system for Natural Language Inference (Kalouli et al., COLING 2020)
PDF:
https://aclanthology.org/2020.coling-main.459.pdf
Code
kkalouli/gkr4nli + additional community code
Data
MultiNLI, SICK, SNLI