GraphMind: Interactive Novelty Assessment System for Accelerating Scientific Discovery

Italo Luis da Silva, Hanqi Yan, Lin Gui, Yulan He


Abstract
Large Language Models (LLMs) show strong reasoning and text generation capabilities, prompting their use in scientific literature analysis, including novelty assessment. While evaluating the novelty of scientific papers is crucial for peer review, it requires extensive knowledge of related work, something not all reviewers have. While recent work on LLM-assisted scientific literature analysis supports literature comparison, existing approaches offer limited transparency and lack mechanisms for result traceability via an information retrieval module. To address this gap, we introduce GraphMind, an easy-to-use interactive web tool designed to assist users in evaluating the novelty of scientific papers or drafted ideas. Specifically, GraphMind enables users to capture the main structure of a scientific paper, annotate its key elements, explore related papers through various relationships, and assess novelty by providing verifiable contextual insights. The tool integrates external APIs such as arXiv and Semantic Scholar with LLMs to support the annotation, extraction, retrieval and classification of papers. This combination provides users with a rich, structured view of a scientific idea’s core contributions and its connections to existing work. GraphMind is available at https://oyarsa.github.io/graphmind and a demonstration video at https://youtu.be/wKbjQpSvwJg.
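
For readers curious about the retrieval step the abstract mentions, the following is a minimal sketch (not GraphMind's actual implementation) of querying the public Semantic Scholar Graph API for papers related to a draft idea; the query string and helper name are illustrative placeholders, while the endpoint and field names follow the documented public API.

import requests

def search_related_papers(query: str, limit: int = 5) -> list[dict]:
    """Return title/abstract/year for papers matching a free-text query."""
    response = requests.get(
        "https://api.semanticscholar.org/graph/v1/paper/search",
        params={"query": query, "limit": limit, "fields": "title,abstract,year"},
        timeout=30,
    )
    response.raise_for_status()
    # The Graph API returns matches under the "data" key.
    return response.json().get("data", [])

if __name__ == "__main__":
    # Example query; GraphMind would derive this from the annotated paper or idea.
    for paper in search_related_papers("LLM-based novelty assessment for peer review"):
        print(paper.get("year"), "-", paper.get("title"))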
Anthology ID:
2025.emnlp-demos.21
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing: System Demonstrations
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Ivan Habernal, Peter Schulam, Jörg Tiedemann
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
286–294
URL:
https://aclanthology.org/2025.emnlp-demos.21/
Cite (ACL):
Italo Luis da Silva, Hanqi Yan, Lin Gui, and Yulan He. 2025. GraphMind: Interactive Novelty Assessment System for Accelerating Scientific Discovery. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing: System Demonstrations, pages 286–294, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
GraphMind: Interactive Novelty Assessment System for Accelerating Scientific Discovery (da Silva et al., EMNLP 2025)
PDF:
https://aclanthology.org/2025.emnlp-demos.21.pdf