Knowledge Graph Unlearning with Schema

Yang Xiao, Ruimeng Ye, Bo Hui


Abstract
Graph unlearning emerges as a crucial step to eliminate the impact of deleted elements from a trained model. However, unlearning on the knowledge graph (KG) has not yet been extensively studied. We remark that KG unlearning is non-trivial because KGs are distinct from general graphs. In this paper, we first propose a new unlearning method based on schema for KG. Specifically, we update the representation of the deleted element's neighborhood with an unlearning objective that regulates the affinity between the affected neighborhood and the instances within the same schema. Second, we introduce a new task: schema unlearning. Given a schema graph to be deleted, we remove all instances matching the pattern and make the trained model forget the removed instances. Last, we evaluate the proposed unlearning method on various KG embedding models with benchmark datasets. Our codes are available at https://github.com/NKUShaw/KGUnlearningBySchema.
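The abstract describes an unlearning objective that regulates the affinity between the affected neighborhood and instances of the same schema. A minimal sketch of such an objective is given below; it is not the paper's exact formulation, and the entity IDs, the margin value, and the function name `schema_unlearning_loss` are assumptions for illustration only.

```python
import torch
import torch.nn.functional as F

def schema_unlearning_loss(entity_emb, affected_ids, deleted_ids,
                           same_schema_ids, margin=1.0):
    """Hypothetical schema-based unlearning objective: push the
    embeddings of entities in the deleted element's neighborhood away
    from the deleted entities while keeping them close to other
    instances that share the same schema (type)."""
    affected = entity_emb[affected_ids]              # (A, d) neighborhood of the deleted element
    deleted = entity_emb[deleted_ids].detach()       # (D, d) embeddings to be forgotten
    schema = entity_emb[same_schema_ids].detach()    # (S, d) instances of the same schema

    # Affinity to forget: mean similarity between affected and deleted entities.
    forget_affinity = (affected @ deleted.t()).mean()

    # Affinity to retain: mean similarity to same-schema instances.
    retain_affinity = (affected @ schema.t()).mean()

    # Hinge-style objective: reduce affinity to the deleted facts while
    # preserving affinity to the schema, so only the affected neighborhood
    # representations are updated (the other embeddings are detached).
    return F.relu(margin + forget_affinity - retain_affinity)

# Example usage on a trained KG embedding table (shapes and IDs are assumed):
emb = torch.nn.Parameter(torch.randn(1000, 128))
loss = schema_unlearning_loss(emb,
                              affected_ids=torch.tensor([3, 7]),
                              deleted_ids=torch.tensor([42]),
                              same_schema_ids=torch.tensor([5, 9, 11]))
loss.backward()  # gradients flow only to the affected neighborhood's rows
```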
Anthology ID:
2025.coling-main.238
Volume:
Proceedings of the 31st International Conference on Computational Linguistics
Month:
January
Year:
2025
Address:
Abu Dhabi, UAE
Editors:
Owen Rambow, Leo Wanner, Marianna Apidianaki, Hend Al-Khalifa, Barbara Di Eugenio, Steven Schockaert
Venue:
COLING
Publisher:
Association for Computational Linguistics
Pages:
3541–3546
URL:
https://aclanthology.org/2025.coling-main.238/
Cite (ACL):
Yang Xiao, Ruimeng Ye, and Bo Hui. 2025. Knowledge Graph Unlearning with Schema. In Proceedings of the 31st International Conference on Computational Linguistics, pages 3541–3546, Abu Dhabi, UAE. Association for Computational Linguistics.
Cite (Informal):
Knowledge Graph Unlearning with Schema (Xiao et al., COLING 2025)
PDF:
https://aclanthology.org/2025.coling-main.238.pdf