Streamlining Biomedical Research with Specialized LLMs

Linqing Chen


Abstract
In this paper, we propose a novel system that integrates state-of-the-art, domain-specific large language models with advanced information retrieval techniques to deliver comprehensive and context-aware responses. Our approach facilitates seamless interaction among diverse components, enabling cross-validation of outputs to produce accurate, high-quality responses enriched with relevant data, images, tables, and other modalities. We demonstrate the system’s capability to enhance response precision by leveraging a robust question-answering model, significantly improving the quality of dialogue generation. The system provides an accessible platform for real-time, high-fidelity interactions, allowing users to benefit from efficient human-computer interaction, precise retrieval, and simultaneous access to a wide range of literature and data. This dramatically improves the research efficiency of professionals in the biomedical and pharmaceutical domains and facilitates faster, more informed decision-making throughout the R&D process. Furthermore, the system proposed in this paper is available at https://synapse-chat.patsnap.com.
Anthology ID:
2025.coling-demos.2
Volume:
Proceedings of the 31st International Conference on Computational Linguistics: System Demonstrations
Month:
January
Year:
2025
Address:
Abu Dhabi, UAE
Editors:
Owen Rambow, Leo Wanner, Marianna Apidianaki, Hend Al-Khalifa, Barbara Di Eugenio, Steven Schockaert, Brodie Mather, Mark Dras
Venue:
COLING
Publisher:
Association for Computational Linguistics
Pages:
9–19
URL:
https://aclanthology.org/2025.coling-demos.2/
Cite (ACL):
Linqing Chen. 2025. Streamlining Biomedical Research with Specialized LLMs. In Proceedings of the 31st International Conference on Computational Linguistics: System Demonstrations, pages 9–19, Abu Dhabi, UAE. Association for Computational Linguistics.
Cite (Informal):
Streamlining Biomedical Research with Specialized LLMs (Chen, COLING 2025)
PDF:
https://aclanthology.org/2025.coling-demos.2.pdf