MolTC: Towards Molecular Relational Modeling In Language Models
Junfeng Fang | Shuai Zhang | Chang Wu | Zhengyi Yang | Zhiyuan Liu | Sihang Li | Kun Wang | Wenjie Du | Xiang Wang
Findings of the Association for Computational Linguistics: ACL 2024
Molecular Relational Learning (MRL), which aims to understand interactions between molecular pairs, plays a pivotal role in advancing biochemical research. Recently, the adoption of large language models (LLMs), known for their vast knowledge repositories and advanced logical inference capabilities, has emerged as a promising way to achieve efficient and effective MRL. Despite their potential, these methods predominantly rely on textual data and thus do not fully harness the wealth of structural information inherent in molecular graphs. Moreover, the absence of a unified framework exacerbates the issue of insufficient data exploitation, as it hinders the sharing of interaction mechanisms learned across various datasets. To address these challenges, this work proposes a novel LLM-based multi-modal framework for molecular interaction modeling following Chain-of-Thought (CoT) theory, termed MolTC, which effectively integrates the graphical information of the two molecules in a pair. To train this integrated framework efficiently, we introduce a *multi-hierarchical CoT theory* to refine its training paradigm, and construct a comprehensive *Molecular Interactive Instructions* dataset for the development of biochemical LLMs involving MRL. Our experiments, conducted across various datasets involving over 4,000,000 molecular pairs, exhibit the superiority of our method over current GNN- and LLM-based baselines. Code is available at https://github.com/MangoKiller/MolTC.