Yunjie He


2024

Predictive Multiplicity of Knowledge Graph Embeddings in Link Prediction
Yuqicheng Zhu | Nico Potyka | Mojtaba Nayyeri | Bo Xiong | Yunjie He | Evgeny Kharlamov | Steffen Staab
Findings of the Association for Computational Linguistics: EMNLP 2024

Knowledge graph embedding (KGE) models are often used to predict missing links in knowledge graphs (KGs). However, multiple KG embeddings can perform almost equally well for link prediction yet give conflicting predictions for unseen queries. This phenomenon is termed predictive multiplicity in the literature. It poses substantial risks for KGE-based applications in high-stakes domains but has been overlooked in KGE research. We define predictive multiplicity in link prediction, introduce evaluation metrics, and measure predictive multiplicity for representative KGE methods on commonly used benchmark datasets. Our empirical study reveals significant predictive multiplicity in link prediction, with 8% to 39% of testing queries exhibiting conflicting predictions. We address this issue by leveraging voting methods from social choice theory, reducing conflicts by 66% to 78% in our experiments.
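
A rough sketch of the underlying idea, not the paper's actual metrics or voting rules: given several near-equally accurate KGE models, one can measure how often their top-ranked answers disagree and resolve disagreements by a simple plurality vote. The `predict_top_entity` method below is a hypothetical stand-in for a trained model's link-prediction interface.

```python
from collections import Counter

# Assumption: each model exposes predict_top_entity(query), returning its
# top-ranked tail entity for a query (h, r, ?). This is illustrative only.

def conflict_rate(models, queries):
    """Fraction of queries on which near-equivalent models disagree about
    the top-ranked entity (a simple proxy for predictive multiplicity)."""
    conflicting = sum(
        1 for q in queries
        if len({m.predict_top_entity(q) for m in models}) > 1
    )
    return conflicting / len(queries)

def plurality_vote(models, query):
    """Resolve a conflicting query by plurality voting: each model casts
    one vote for its top-ranked entity; ties break arbitrarily here."""
    votes = Counter(m.predict_top_entity(query) for m in models)
    return votes.most_common(1)[0][0]
```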