Foundation Model for Biomedical Graphs: Integrating Knowledge Graphs and Protein Structures to Large Language Models

Yunsoo Kim


Abstract
The Transformer has become the de-facto standard architecture in natural language processing. Its adaptations to other fields such as computer vision have shown promising results, suggesting that this architecture is a powerful neural network for representation learning regardless of data type. This recent success has led to research on multimodal Large Language Models (LLMs), which enable new types of tasks and applications involving multiple data types. However, multimodal LLMs in the biomedical domain are primarily limited to images, text, and/or sequence data. Here I propose to work on a multimodal LLM architecture for biomedical graphs such as protein structures and chemical molecules. The research hypothesis is based on the fact that clinicians and researchers in computational biology and clinical research draw on diverse sources of information in their decision-making. Therefore, an AI model that can handle multiple data types should be better able to use diverse knowledge, improving performance in clinical applications.
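To make the proposal concrete, the following is a minimal, purely illustrative sketch of one common way graph-structured inputs (e.g., pooled embeddings from a protein-structure or knowledge-graph encoder) can be fused with an LLM: projecting them into the model's token-embedding space as prefix tokens. This is not the method described in the paper; all class names, dimensions, and the linear-projection choice are assumptions for illustration only.

```python
import torch
import torch.nn as nn


class GraphToLLMAdapter(nn.Module):
    """Hypothetical adapter: maps a pooled graph embedding to k 'soft' prefix tokens
    in the LLM's embedding space."""

    def __init__(self, graph_dim: int, llm_dim: int, num_prefix_tokens: int = 8):
        super().__init__()
        self.num_prefix_tokens = num_prefix_tokens
        # Simple linear projection; real systems may use an MLP or cross-attention instead.
        self.proj = nn.Linear(graph_dim, llm_dim * num_prefix_tokens)

    def forward(self, graph_embedding: torch.Tensor) -> torch.Tensor:
        # graph_embedding: (batch, graph_dim), a pooled representation of one graph per example
        batch = graph_embedding.shape[0]
        prefix = self.proj(graph_embedding)                      # (batch, llm_dim * k)
        return prefix.view(batch, self.num_prefix_tokens, -1)    # (batch, k, llm_dim)


# Usage sketch: prepend the projected graph tokens to the LLM's token embeddings.
adapter = GraphToLLMAdapter(graph_dim=512, llm_dim=4096)
graph_emb = torch.randn(2, 512)        # e.g., pooled protein-structure embedding (illustrative)
text_emb = torch.randn(2, 32, 4096)    # token embeddings produced by the LLM's embedding layer
llm_input = torch.cat([adapter(graph_emb), text_emb], dim=1)
```

The design choice shown here (soft prefix tokens) is only one of several fusion strategies compatible with the abstract's goal; alternatives include cross-attention layers or graph-aware tokenization.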
Anthology ID:
2024.acl-srw.30
Volume:
Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 4: Student Research Workshop)
Month:
August
Year:
2024
Address:
Bangkok, Thailand
Editors:
Xiyan Fu, Eve Fleisig
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
346–355
URL:
https://aclanthology.org/2024.acl-srw.30
Cite (ACL):
Yunsoo Kim. 2024. Foundation Model for Biomedical Graphs: Integrating Knowledge Graphs and Protein Structures to Large Language Models. In Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 4: Student Research Workshop), pages 346–355, Bangkok, Thailand. Association for Computational Linguistics.
Cite (Informal):
Foundation Model for Biomedical Graphs: Integrating Knowledge Graphs and Protein Structures to Large Language Models (Kim, ACL 2024)
PDF:
https://aclanthology.org/2024.acl-srw.30.pdf