Generalized Quantifiers as a Source of Error in Multilingual NLU Benchmarks

Ruixiang Cui, Daniel Hershcovich, Anders Søgaard


Abstract
Logical approaches to representing language have developed and evaluated computational models of quantifier words since the 19th century, but today’s NLU models still struggle to capture their semantics. We rely on Generalized Quantifier Theory for language-independent representations of the semantics of quantifier words, in order to quantify their contribution to the errors of NLU models. We find that quantifiers are pervasive in NLU benchmarks, and that their occurrence at test time is associated with performance drops. Multilingual models also exhibit unsatisfactory quantifier reasoning abilities, though not necessarily worse for non-English languages than for English. To facilitate directly targeted probing, we present an adversarial generalized quantifier NLI task (GQNLI) and show that pre-trained language models clearly lack robustness in generalized quantifier reasoning.
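
The abstract describes probing pre-trained NLI models with quantifier-focused premise–hypothesis pairs. Below is a minimal sketch of that kind of probe using an off-the-shelf MNLI model from Hugging Face Transformers; the model name and the example sentence pair are illustrative assumptions and are not taken from the paper, whose actual data and code are in the ruixiangcui/gqnli repository linked further down.

# Minimal sketch: querying an off-the-shelf NLI model with a generalized-quantifier
# example. Model choice and sentences are illustrative assumptions, not the authors' setup.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("roberta-large-mnli")
model = AutoModelForSequenceClassification.from_pretrained("roberta-large-mnli")

premise = "Exactly two of the five students passed the exam."   # quantifier: "exactly two"
hypothesis = "Most of the students passed the exam."             # quantifier: "most"

inputs = tokenizer(premise, hypothesis, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

label = model.config.id2label[logits.argmax(dim=-1).item()]
# The gold label for this pair is CONTRADICTION ("most" requires at least three of five);
# any other prediction is the kind of quantifier reasoning error the paper investigates.
print(label)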
Anthology ID:
2022.dadc-1.7
Volume:
Proceedings of the First Workshop on Dynamic Adversarial Data Collection
Month:
July
Year:
2022
Address:
Seattle, WA
Editors:
Max Bartolo, Hannah Kirk, Pedro Rodriguez, Katerina Margatina, Tristan Thrush, Robin Jia, Pontus Stenetorp, Adina Williams, Douwe Kiela
Venue:
DADC
Publisher:
Association for Computational Linguistics
Pages:
61–61
URL:
https://aclanthology.org/2022.dadc-1.7
DOI:
10.18653/v1/2022.dadc-1.7
Cite (ACL):
Ruixiang Cui, Daniel Hershcovich, and Anders Søgaard. 2022. Generalized Quantifiers as a Source of Error in Multilingual NLU Benchmarks. In Proceedings of the First Workshop on Dynamic Adversarial Data Collection, pages 61–61, Seattle, WA. Association for Computational Linguistics.
Cite (Informal):
Generalized Quantifiers as a Source of Error in Multilingual NLU Benchmarks (Cui et al., DADC 2022)
PDF:
https://aclanthology.org/2022.dadc-1.7.pdf
Video:
https://aclanthology.org/2022.dadc-1.7.mp4
Code
ruixiangcui/gqnli
Data
ANLI, MLQA, MultiNLI, SNLI, TaxiNLI, XNLI, XQuAD