Improved Contrastive Learning over Commonsense Knowledge Graphs for Unsupervised Reasoning

Rongwen Zhao, Jeffrey Flanigan


Abstract
Knowledge-augmented methods leverage external resources such as commonsense knowledge graphs (CSKGs) to improve downstream reasoning tasks. Recent work has explored contrastive learning over relation-aware sequence pairs derived from CSKG triples to inject commonsense knowledge into pre-trained language models (PLMs). However, existing approaches suffer from two key limitations: they rely solely on randomly sampled in-batch negatives, overlooking more informative hard negatives, and they ignore additional plausible positives that could strengthen training. Both factors limit the effectiveness of contrastive knowledge learning. In this paper, we propose an enhanced contrastive learning framework for CSKGs that integrates hard negative sampling and positive set expansion. Hard negatives are dynamically selected based on semantic similarity to ensure the model learns from challenging distinctions, while positive set expansion exploits the property that similar head entities often share overlapping tail entities, allowing the recovery of missing positives. We evaluate our method on unsupervised commonsense question answering and inductive CSKG completion using ConceptNet and ATOMIC. Experimental results demonstrate consistent improvements over strong baselines, confirming that our approach yields richer commonsense-aware representations and more effective knowledge injection into PLMs.
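The two mechanisms in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: the function names, the cosine-similarity measure, and the rule "head-relation pairs that share any tail pool their tails as positives" are assumptions made for illustration only.

```python
import numpy as np
from collections import defaultdict

def cosine_sim(a, b):
    # Row-wise cosine similarity between two embedding matrices.
    a = a / np.linalg.norm(a, axis=-1, keepdims=True)
    b = b / np.linalg.norm(b, axis=-1, keepdims=True)
    return a @ b.T

def expand_positives(triples):
    # Positive set expansion (illustrative rule): if two (head, relation)
    # queries share at least one tail, each inherits the other's tails
    # as additional positives.
    tails_of = defaultdict(set)
    for h, r, t in triples:
        tails_of[(h, r)].add(t)
    expanded = {k: set(v) for k, v in tails_of.items()}
    keys = list(tails_of)
    for i, ki in enumerate(keys):
        for kj in keys[i + 1:]:
            if tails_of[ki] & tails_of[kj]:
                expanded[ki] |= tails_of[kj]
                expanded[kj] |= tails_of[ki]
    return expanded

def select_hard_negatives(query_emb, cand_embs, positive_idx, k=2):
    # Hard negative sampling (illustrative): rank all candidate tails by
    # similarity to the query and keep the top-k that are NOT positives,
    # so the contrastive loss sees challenging, near-miss negatives.
    sims = cosine_sim(query_emb[None, :], cand_embs)[0]
    order = np.argsort(-sims)
    return [i for i in order if i not in positive_idx][:k]
```

In a full training loop, the expanded positive sets and the mined hard negatives would feed an InfoNCE-style contrastive objective over relation-aware sequence pairs; the sketch above only shows the candidate-selection step.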
Anthology ID:
2025.r2lm-1.17
Volume:
Proceedings of the First Workshop on Comparative Performance Evaluation: From Rules to Language Models
Month:
September
Year:
2025
Address:
Varna, Bulgaria
Editors:
Alicia Picazo-Izquierdo, Ernesto Luis Estevanell-Valladares, Ruslan Mitkov, Rafael Muñoz Guillena, Raúl García Cerdá
Venues:
R2LM | WS
Publisher:
INCOMA Ltd., Shoumen, Bulgaria
Pages:
165–178
URL:
https://aclanthology.org/2025.r2lm-1.17/
Cite (ACL):
Rongwen Zhao and Jeffrey Flanigan. 2025. Improved Contrastive Learning over Commonsense Knowledge Graphs for Unsupervised Reasoning. In Proceedings of the First Workshop on Comparative Performance Evaluation: From Rules to Language Models, pages 165–178, Varna, Bulgaria. INCOMA Ltd., Shoumen, Bulgaria.
Cite (Informal):
Improved Contrastive Learning over Commonsense Knowledge Graphs for Unsupervised Reasoning (Zhao & Flanigan, R2LM 2025)
PDF:
https://aclanthology.org/2025.r2lm-1.17.pdf