CIKQA: Learning Commonsense Inference with a Unified Knowledge-in-the-loop QA Paradigm

Hongming Zhang, Yintong Huo, Yanai Elazar, Yangqiu Song, Yoav Goldberg, Dan Roth


Abstract
We propose a new commonsense reasoning benchmark to motivate commonsense reasoning progress from two perspectives: (1) evaluating whether models can distinguish knowledge quality by predicting whether the knowledge is sufficient to answer the question; (2) evaluating whether models can develop commonsense inference capabilities that generalize across tasks. We first extract supporting knowledge for each question and ask humans to annotate whether the automatically extracted knowledge is sufficient to answer the question. We then convert different tasks into a unified question-answering format to evaluate the models’ generalization capabilities. We name the benchmark Commonsense Inference with Knowledge-in-the-loop Question Answering (CIKQA). Experiments show that with our learning paradigm, models demonstrate encouraging generalization capabilities. At the same time, we observe that distinguishing knowledge quality remains challenging for current commonsense reasoning models.
Anthology ID:
2023.findings-eacl.8
Volume:
Findings of the Association for Computational Linguistics: EACL 2023
Month:
May
Year:
2023
Address:
Dubrovnik, Croatia
Editors:
Andreas Vlachos, Isabelle Augenstein
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
114–124
URL:
https://aclanthology.org/2023.findings-eacl.8
DOI:
10.18653/v1/2023.findings-eacl.8
Cite (ACL):
Hongming Zhang, Yintong Huo, Yanai Elazar, Yangqiu Song, Yoav Goldberg, and Dan Roth. 2023. CIKQA: Learning Commonsense Inference with a Unified Knowledge-in-the-loop QA Paradigm. In Findings of the Association for Computational Linguistics: EACL 2023, pages 114–124, Dubrovnik, Croatia. Association for Computational Linguistics.
Cite (Informal):
CIKQA: Learning Commonsense Inference with a Unified Knowledge-in-the-loop QA Paradigm (Zhang et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-eacl.8.pdf
Video:
https://aclanthology.org/2023.findings-eacl.8.mp4