Na Dong
2025
Refining Noisy Knowledge Graph with Large Language Models
Na Dong | Natthawut Kertkeidkachorn | Xin Liu | Kiyoaki Shirai
Proceedings of the Workshop on Generative AI and Knowledge Graphs (GenAIK)
Knowledge graphs (KGs) represent structured real-world information as triples of head entity, relation, and tail entity. These graphs can be constructed automatically from text or manually curated. However, regardless of the construction method, KGs often suffer from misinformation, incompleteness, and noise, which hinder their reliability and utility. This study addresses the challenge of noisy KGs, where incorrect or misaligned entities and relations degrade graph quality. Leveraging recent advances in large language models (LLMs), which show strong capabilities across diverse tasks, we explore their potential to detect and refine noise in KGs. Specifically, we propose a novel method, LLM_sim, to enhance the detection and refinement of noisy triples. Our results confirm the effectiveness of this approach in improving KG quality in noisy environments. Additionally, we apply the proposed method to Knowledge Graph Completion (KGC), a downstream KG task that aims to predict missing links and improve graph completeness. Traditional KGC methods assume that KGs are noise-free, which is unrealistic in practice. Our experiments analyze the impact of varying noise levels on KGC performance, revealing that LLMs can mitigate noise by identifying and refining incorrect entries, thereby enhancing KG quality.
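The abstract does not specify how LLM_sim queries the model, so the sketch below is only a generic illustration of LLM-based triple verification, not a reproduction of the paper's method. The names `Triple`, `is_plausible`, `filter_noisy`, and the `llm` callable are all hypothetical; the stub would be replaced by a real LLM client, and a refinement step could correct a rejected triple rather than simply dropping it.

```python
# Illustrative sketch only, assuming triple verification is done with a
# yes/no plausibility prompt to a generic llm(prompt) -> str callable.
from typing import Callable, List, Tuple

Triple = Tuple[str, str, str]  # (head entity, relation, tail entity)

def is_plausible(triple: Triple, llm: Callable[[str], str]) -> bool:
    """Ask the LLM whether a triple states a true fact."""
    head, relation, tail = triple
    prompt = (
        "Does the following knowledge-graph triple state a true fact? "
        "Answer only 'yes' or 'no'.\n"
        f"({head}, {relation}, {tail})"
    )
    return llm(prompt).strip().lower().startswith("yes")

def filter_noisy(triples: List[Triple], llm: Callable[[str], str]) -> List[Triple]:
    """Keep only the triples the LLM judges plausible."""
    return [t for t in triples if is_plausible(t, llm)]

if __name__ == "__main__":
    # Stub LLM for demonstration; swap in an actual model call in practice.
    def stub_llm(prompt: str) -> str:
        return "yes" if "France" in prompt else "no"

    kg = [("Paris", "capital_of", "France"), ("Paris", "capital_of", "Japan")]
    print(filter_noisy(kg, stub_llm))  # -> [('Paris', 'capital_of', 'France')]
```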
2003
An Effective Combination of Different Order N-grams
Sen Zhang | Na Dong
Proceedings of the 17th Pacific Asia Conference on Language, Information and Computation