Learning from Missing Relations: Contrastive Learning with Commonsense Knowledge Graphs for Commonsense Inference

Yong-Ho Jung, Jun-Hyung Park, Joon-Young Choi, Mingyu Lee, Junho Kim, Kang-Min Kim, SangKeun Lee


Abstract
Commonsense inference poses the unique challenge of reasoning about and generating the physical, social, and causal conditions of a given event. Existing approaches to commonsense inference utilize commonsense transformers, large-scale language models that learn commonsense knowledge graphs. However, they suffer from the limited coverage and expressive diversity of these graphs, which degrades the quality of the learned representations. In this paper, we focus on addressing missing relations in commonsense knowledge graphs and propose a novel contrastive learning framework called SOLAR. Our framework contrasts sets of semantically similar and dissimilar events, learning richer inferential knowledge than existing approaches. Empirical results demonstrate the efficacy of SOLAR in commonsense inference over diverse commonsense knowledge graphs. Specifically, SOLAR outperforms the state-of-the-art commonsense transformer on commonsense inference with ConceptNet by 1.84% on average across 8 automatic evaluation metrics. An in-depth analysis of SOLAR sheds light on the effects of the missing relations utilized in learning commonsense knowledge graphs.
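
To illustrate the general idea of contrasting semantically similar and dissimilar events, the following is a minimal sketch of an InfoNCE-style contrastive objective over event embeddings. It is an assumption about the standard contrastive-learning technique, not the SOLAR implementation from the paper; the function name, the batch construction, and the temperature value of 0.1 are hypothetical.

    import torch
    import torch.nn.functional as F

    def contrastive_loss(anchor, positive, negatives, temperature=0.1):
        # anchor: (d,) embedding of an event; positive: (d,) embedding of a
        # semantically similar event; negatives: (k, d) embeddings of
        # semantically dissimilar events. All values are hypothetical.
        anchor = F.normalize(anchor, dim=-1)
        positive = F.normalize(positive, dim=-1)
        negatives = F.normalize(negatives, dim=-1)
        pos_sim = (anchor * positive).sum(-1, keepdim=True) / temperature  # (1,)
        neg_sim = (negatives @ anchor) / temperature                       # (k,)
        logits = torch.cat([pos_sim, neg_sim], dim=0).unsqueeze(0)         # (1, 1+k)
        # Index 0 holds the positive pair: cross-entropy pulls similar events
        # together and pushes dissimilar events apart in the embedding space.
        return F.cross_entropy(logits, torch.zeros(1, dtype=torch.long))

    # Example with random embeddings (illustrative dimensions only):
    loss = contrastive_loss(torch.randn(128), torch.randn(128), torch.randn(16, 128))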
Anthology ID:
2022.findings-acl.119
Volume:
Findings of the Association for Computational Linguistics: ACL 2022
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
1514–1523
URL:
https://aclanthology.org/2022.findings-acl.119
DOI:
10.18653/v1/2022.findings-acl.119
Cite (ACL):
Yong-Ho Jung, Jun-Hyung Park, Joon-Young Choi, Mingyu Lee, Junho Kim, Kang-Min Kim, and SangKeun Lee. 2022. Learning from Missing Relations: Contrastive Learning with Commonsense Knowledge Graphs for Commonsense Inference. In Findings of the Association for Computational Linguistics: ACL 2022, pages 1514–1523, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Learning from Missing Relations: Contrastive Learning with Commonsense Knowledge Graphs for Commonsense Inference (Jung et al., Findings 2022)
PDF:
https://aclanthology.org/2022.findings-acl.119.pdf
Video:
https://aclanthology.org/2022.findings-acl.119.mp4
Code
yongho94/solar-framework_commonsense-inference
Data
ConceptNet
Event2Mind