Investigating the Performance of Transformer-Based NLI Models on Presuppositional Inferences

Jad Kabbara, Jackie Chi Kit Cheung


Abstract
Presuppositions are assumptions that are taken for granted by an utterance, and identifying them is key to a pragmatic interpretation of language. In this paper, we investigate the ability of transformer-based models to perform NLI on cases involving presupposition. First, we present simple heuristics for creating alternative “contrastive” test cases based on the ImpPres dataset and evaluate model performance on those test cases. Second, to better understand how the models make their predictions, we analyze samples from the sub-datasets of ImpPres and examine model performance on them. Overall, our findings suggest that NLI-trained transformer models exploit specific structural and lexical cues rather than performing genuinely pragmatic reasoning.
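To make the "contrastive test case" idea concrete, here is a minimal illustrative sketch (not the paper's actual heuristics): given an NLI pair whose hypothesis is a presupposition of the premise, swap the presupposition trigger for a lexically similar word that does not carry the same presupposition, keeping the hypothesis fixed. The trigger pair below ("stopped"/"started") and the helper `make_pairs` are hypothetical examples chosen for illustration.

```python
# Illustrative sketch of a contrastive test-case heuristic (assumed, not the
# paper's exact procedure): "stopped" presupposes the prior state held, while
# the swapped-in trigger "started" does not, so the same hypothesis is no
# longer presupposed by the contrastive premise.
TRIGGER_SWAPS = {
    "stopped": "started",  # hypothetical trigger pair for illustration
}

def make_pairs(premise: str, hypothesis: str):
    """Return (original, contrastive) premise-hypothesis pairs.

    The original pair would be labeled 'entailment' when the hypothesis is a
    presupposition of the premise; the contrastive pair replaces the trigger,
    so a model relying on pragmatic reasoning (rather than lexical cues)
    should change its prediction.
    """
    tokens = premise.split()
    swapped = [TRIGGER_SWAPS.get(t, t) for t in tokens]
    contrastive = " ".join(swapped)
    return (premise, hypothesis), (contrastive, hypothesis)

orig, contr = make_pairs("John stopped smoking", "John used to smoke")
# orig  -> ("John stopped smoking", "John used to smoke")
# contr -> ("John started smoking", "John used to smoke")
```

A model that keeps predicting entailment on the contrastive pair is likely keying on surface cues (e.g., the shared "smoking"/"smoke" material) rather than on the presupposition trigger itself.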
Anthology ID:
2022.coling-1.65
Volume:
Proceedings of the 29th International Conference on Computational Linguistics
Month:
October
Year:
2022
Address:
Gyeongju, Republic of Korea
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
779–785
URL:
https://aclanthology.org/2022.coling-1.65
Cite (ACL):
Jad Kabbara and Jackie Chi Kit Cheung. 2022. Investigating the Performance of Transformer-Based NLI Models on Presuppositional Inferences. In Proceedings of the 29th International Conference on Computational Linguistics, pages 779–785, Gyeongju, Republic of Korea. International Committee on Computational Linguistics.
Cite (Informal):
Investigating the Performance of Transformer-Based NLI Models on Presuppositional Inferences (Kabbara & Cheung, COLING 2022)
PDF:
https://aclanthology.org/2022.coling-1.65.pdf
Data:
IMPPRES, MultiNLI