Improving Unsupervised Commonsense Reasoning Using Knowledge-Enabled Natural Language Inference

Canming Huang, Weinan He, Yongmei Liu


Abstract
Recent methods based on pre-trained language models have shown strong supervised performance on commonsense reasoning. However, they rely on expensive data annotation and time-consuming training. Thus, we focus on unsupervised commonsense reasoning. We show the effectiveness of using a common framework, Natural Language Inference (NLI), to solve diverse commonsense reasoning tasks. By leveraging transfer learning from large NLI datasets and injecting crucial knowledge from commonsense sources such as ATOMIC 2020 and ConceptNet, our method achieves state-of-the-art unsupervised performance on two commonsense reasoning tasks: WinoWhy and CommonsenseQA. Further analysis demonstrates the benefits of multiple categories of knowledge, though problems involving quantities and antonyms remain challenging.
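The abstract's NLI framing can be sketched as follows: each answer choice is turned into a hypothesis, and the choice whose hypothesis is most entailed by the question or context is selected. This is a minimal illustrative sketch, not the paper's implementation; the hypothesis template and the stub scorer below are assumptions, standing in for a model fine-tuned on large NLI datasets such as MultiNLI.

```python
def select_answer(premise, choices, entail_score):
    """Pick the choice whose hypothesis gets the highest entailment score.

    entail_score(premise, hypothesis) -> float, e.g. the ENTAILMENT
    probability from an NLI model (here replaced by a stub).
    """
    # Hypothetical template: the real system's phrasing may differ.
    hypotheses = {c: f"{premise} The answer is {c}." for c in choices}
    return max(choices, key=lambda c: entail_score(premise, hypotheses[c]))

# Stub scorer for demonstration only: looks up a fixed score per choice,
# simulating what an NLI model would return for each hypothesis.
fake_scores = {"cupboard": 0.91, "oven": 0.12, "garden": 0.05}

def stub_score(premise, hypothesis):
    return next(s for c, s in fake_scores.items() if c in hypothesis)

best = select_answer("Where do you store a clean plate?",
                     list(fake_scores), stub_score)
print(best)  # the choice with the highest (stubbed) entailment score
```

In the actual system, `entail_score` would come from a model transferred from NLI datasets, optionally with commonsense knowledge (e.g. ConceptNet triples) injected into the premise.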
Anthology ID:
2021.findings-emnlp.420
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2021
Month:
November
Year:
2021
Address:
Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
Findings
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
4875–4885
URL:
https://aclanthology.org/2021.findings-emnlp.420
DOI:
10.18653/v1/2021.findings-emnlp.420
Cite (ACL):
Canming Huang, Weinan He, and Yongmei Liu. 2021. Improving Unsupervised Commonsense Reasoning Using Knowledge-Enabled Natural Language Inference. In Findings of the Association for Computational Linguistics: EMNLP 2021, pages 4875–4885, Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Improving Unsupervised Commonsense Reasoning Using Knowledge-Enabled Natural Language Inference (Huang et al., Findings 2021)
PDF:
https://aclanthology.org/2021.findings-emnlp.420.pdf
Video:
https://aclanthology.org/2021.findings-emnlp.420.mp4
Data:
ATOMIC, CommonsenseQA, ConceptNet, GLUE, MultiNLI, QNLI, WSC, WinoGrande, WinoWhy