MSG-LLM: A Multi-scale Interactive Framework for Graph-enhanced Large Language Models

Jiayu Ding, Zhangkai Zheng, Benshuo Lin, Yun Xue, Yiping Song


Abstract
Graph-enhanced large language models (LLMs) leverage LLMs’ remarkable ability to model language and use graph structures to capture topological relationships. Existing graph-enhanced LLMs typically retrieve similar subgraphs to augment the LLM, where each subgraph carries entities related to the target and the relations among those entities. However, existing retrieval methods focus solely on accurately matching the target subgraph against candidate subgraphs at the same scale, neglecting that subgraphs at different scales may also share similar semantics or structures. To tackle this challenge, we introduce a graph-enhanced LLM with multi-scale retrieval (MSG-LLM). It captures similar graph structures and semantics across graphs at different scales and bridges graph alignment across multiple scales. Larger scales preserve the graph’s global information, while smaller scales preserve the details of fine-grained substructures. Specifically, we construct a multi-scale variation mechanism that dynamically shrinks the scale of graphs. Further, we employ a graph kernel search to discover subgraphs within the entire graph, which essentially performs multi-scale graph retrieval in a Hilbert space. Additionally, we conduct multi-scale interactions (message passing) over graphs at various scales to integrate key information. This interaction also bridges the graph and the LLM, aiding both graph retrieval and LLM generation. Finally, we employ Chain-of-Thought-based LLM prediction to perform downstream tasks. We evaluate our approach on two graph-based downstream tasks, and the experimental results show that our method achieves state-of-the-art performance.
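The abstract's retrieval pipeline — shrink the graph to multiple scales, then score candidates with a graph kernel (an inner product in Hilbert space) at every scale — can be illustrated with a toy sketch. This is not the paper's implementation: a Weisfeiler-Lehman-style label-histogram kernel stands in for its graph kernel search, naive edge contraction stands in for its multi-scale variation mechanism, and all function names (`wl_features`, `kernel_sim`, `coarsen`, `multiscale_retrieve`) are hypothetical.

```python
# Illustrative sketch only. A Weisfeiler-Lehman-style kernel stands in for
# the paper's graph kernel search, and naive edge contraction stands in for
# its multi-scale shrinking; all names here are hypothetical.
from collections import Counter

def wl_features(adj, labels, iters=2):
    """Histogram of Weisfeiler-Lehman labels: the feature map phi(G)."""
    feats = Counter(labels.values())
    cur = dict(labels)
    for _ in range(iters):
        # Relabel each node by (own label, sorted neighbor labels).
        cur = {v: (cur[v], tuple(sorted(cur[u] for u in adj[v])))
               for v in adj}
        feats.update(cur.values())
    return feats

def kernel_sim(g1, g2, iters=2):
    """Inner product of WL histograms, i.e. a kernel value in Hilbert space."""
    f1, f2 = wl_features(*g1, iters), wl_features(*g2, iters)
    return sum(f1[k] * f2[k] for k in f1.keys() & f2.keys())

def coarsen(graph):
    """Shrink the graph by one node: contract the first edge found."""
    adj, labels = graph
    edge = next(((a, b) for a in adj for b in adj[a]), None)
    if edge is None:  # no edges left: cannot shrink further
        return graph
    u, v = edge
    # Drop v, redirect its edges to u, and forbid self-loops.
    new_adj = {w: {u if x == v else x for x in nbrs} - {w}
               for w, nbrs in adj.items() if w != v}
    new_adj[u] = (new_adj[u] | {u if x == v else x for x in adj[v]}) - {u}
    return new_adj, {w: l for w, l in labels.items() if w != v}

def multiscale_retrieve(target, candidates, num_scales=3):
    """Score each candidate against the target at every scale; best max wins."""
    scales = [target]
    for _ in range(num_scales - 1):
        scales.append(coarsen(scales[-1]))
    return max(candidates,
               key=lambda c: max(kernel_sim(s, c) for s in scales))
```

A graph here is a pair `(adjacency_dict, label_dict)`; taking the maximum similarity over all scales is what lets a small candidate match a coarsened view of a larger target, which is the intuition the abstract describes.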
Anthology ID:
2025.coling-main.648
Volume:
Proceedings of the 31st International Conference on Computational Linguistics
Month:
January
Year:
2025
Address:
Abu Dhabi, UAE
Editors:
Owen Rambow, Leo Wanner, Marianna Apidianaki, Hend Al-Khalifa, Barbara Di Eugenio, Steven Schockaert
Venue:
COLING
Publisher:
Association for Computational Linguistics
Pages:
9687–9700
URL:
https://aclanthology.org/2025.coling-main.648/
Cite (ACL):
Jiayu Ding, Zhangkai Zheng, Benshuo Lin, Yun Xue, and Yiping Song. 2025. MSG-LLM: A Multi-scale Interactive Framework for Graph-enhanced Large Language Models. In Proceedings of the 31st International Conference on Computational Linguistics, pages 9687–9700, Abu Dhabi, UAE. Association for Computational Linguistics.
Cite (Informal):
MSG-LLM: A Multi-scale Interactive Framework for Graph-enhanced Large Language Models (Ding et al., COLING 2025)
PDF:
https://aclanthology.org/2025.coling-main.648.pdf