CRoW: Benchmarking Commonsense Reasoning in Real-World Tasks

Mete Ismayilzada, Debjit Paul, Syrielle Montariol, Mor Geva, Antoine Bosselut

Abstract
Recent efforts in natural language processing (NLP) commonsense reasoning research have yielded a considerable number of new datasets and benchmarks. However, most of these datasets formulate commonsense reasoning challenges in artificial scenarios that are not reflective of the tasks which real-world NLP systems are designed to solve. In this work, we present CRoW, a manually-curated, multi-task benchmark that evaluates the ability of models to apply commonsense reasoning in the context of six real-world NLP tasks. CRoW is constructed using a multi-stage data collection pipeline that rewrites examples from existing datasets using commonsense-violating perturbations. We use CRoW to study how NLP systems perform across different dimensions of commonsense knowledge, such as physical, temporal, and social reasoning. We find a significant performance gap when NLP systems are evaluated on CRoW compared to humans, showcasing that commonsense reasoning is far from being solved in real-world task settings. We make our dataset and leaderboard available to the research community.
Anthology ID:
2023.emnlp-main.607
Volume:
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
9785–9821
URL:
https://aclanthology.org/2023.emnlp-main.607
DOI:
10.18653/v1/2023.emnlp-main.607
Cite (ACL):
Mete Ismayilzada, Debjit Paul, Syrielle Montariol, Mor Geva, and Antoine Bosselut. 2023. CRoW: Benchmarking Commonsense Reasoning in Real-World Tasks. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 9785–9821, Singapore. Association for Computational Linguistics.
Cite (Informal):
CRoW: Benchmarking Commonsense Reasoning in Real-World Tasks (Ismayilzada et al., EMNLP 2023)
PDF:
https://aclanthology.org/2023.emnlp-main.607.pdf
Video:
https://aclanthology.org/2023.emnlp-main.607.mp4