Shijian Li


2025

Knowledge Graph Pooling and Unpooling for Concept Abstraction
Juan Li | Wen Zhang | Zhiqiang Liu | Mingchen Tu | Mingyang Chen | Ningyu Zhang | Shijian Li
Proceedings of the 31st International Conference on Computational Linguistics

Knowledge graph embedding (KGE) aims to embed entities and relations as vectors in a continuous space and has proven effective for KG tasks. Recently, graph neural network (GNN) based KGEs have gained much attention due to their strong capability for encoding complex graph structures. However, most GNN-based KGEs are optimized directly on the instance triples in KGs, ignoring the latent concepts and hierarchies of the entities. Though some works explicitly inject concepts and hierarchies into their models, they are limited to predefined concepts and hierarchies, which are missing in many KGs. Thus, in this paper, we propose a novel framework with KG Pooling and unpooling and Contrastive Learning (KGPCL) to abstract and encode latent concepts for better KG prediction. Specifically, given an input KG, we first construct a U-KG through KG pooling and unpooling: KG pooling abstracts the input graph into a smaller pooled graph, and KG unpooling recovers the input graph from the pooled graph. We then model the U-KG with relational KGEs to obtain representations of entities and relations for prediction. Finally, we propose local and global contrastive losses to jointly enhance the entity representations. Experimental results show that our models outperform KGE baselines on the link prediction task.
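
To make the pool-unpool-contrast pipeline concrete, here is a minimal sketch of the idea in PyTorch. The abstract does not specify implementation details, so everything below is an assumption: the soft assignment matrix for pooling, the mean-style pooling and unpooling by the same assignment, and an InfoNCE-style contrastive loss between original and recovered entity embeddings (the names KGPoolUnpool, local_contrastive_loss, and tau are hypothetical).

```python
# Illustrative sketch only; all names and design choices are assumptions,
# not the paper's actual implementation.
import torch
import torch.nn.functional as F

class KGPoolUnpool(torch.nn.Module):
    def __init__(self, num_entities, dim, num_concepts):
        super().__init__()
        self.entity_emb = torch.nn.Embedding(num_entities, dim)
        # Learned soft assignment of entities to latent concept nodes.
        self.assign = torch.nn.Linear(dim, num_concepts)

    def forward(self):
        x = self.entity_emb.weight             # (N, d) entity embeddings
        s = F.softmax(self.assign(x), dim=-1)  # (N, C) entity-to-concept weights
        pooled = s.t() @ x                     # (C, d) pooled concept embeddings
        recovered = s @ pooled                 # (N, d) unpooled entity embeddings
        return x, pooled, recovered

def local_contrastive_loss(x, recovered, tau=0.1):
    # InfoNCE-style objective: each entity's original embedding should be
    # most similar to its own recovered embedding among all entities.
    x = F.normalize(x, dim=-1)
    r = F.normalize(recovered, dim=-1)
    logits = x @ r.t() / tau                   # (N, N) similarity matrix
    labels = torch.arange(x.size(0))           # positives on the diagonal
    return F.cross_entropy(logits, labels)

model = KGPoolUnpool(num_entities=100, dim=32, num_concepts=8)
x, pooled, recovered = model()
loss = local_contrastive_loss(x, recovered)
loss.backward()
print(f"contrastive loss: {loss.item():.4f}")
```

In this reading, the pooled concept nodes play the role of the latent concepts abstracted from the input KG, and the contrastive loss ties each entity to its reconstruction through those concepts; the paper additionally models the resulting U-KG with relational KGEs and a global contrastive term, which this sketch omits.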