Repurformer: Transformers for Repurposing-Aware Molecule Generation

Changhun Lee, Gyumin Lee

Abstract
Generating molecules that are as diverse as possible while retaining the desired properties is crucial for drug discovery, and this goal has motivated many approaches based on deep generative models. Despite recent advances in these models, particularly variational autoencoders (VAEs), generative adversarial networks (GANs), Transformers, and diffusion models, a significant challenge known as the sample bias problem remains: molecules generated for the same target protein tend to be structurally similar, reducing the diversity of generation. To address this, we propose leveraging multi-hop relationships among proteins and compounds. Our model, Repurformer, integrates bi-directional pretraining with the Fast Fourier Transform (FFT) and low-pass filtering (LPF) to capture complex interactions and generate diverse molecules. A series of experiments on the BindingDB dataset confirms that Repurformer successfully creates substitutes for anchor compounds that resemble positive compounds, increasing the diversity between the anchor and generated compounds.
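The abstract's mention of FFT and low-pass filtering suggests frequency-domain smoothing of sequence representations. The sketch below is a minimal, hypothetical illustration of that idea in PyTorch; the function name, tensor shapes, and the `keep_ratio` cutoff are assumptions for illustration, not the paper's implementation.

```python
# Illustrative sketch (not the authors' code): FFT-based low-pass
# filtering over a sequence of hidden states. The cutoff ratio and
# all names are assumptions made for this example.
import torch

def fft_low_pass(hidden, keep_ratio=0.25):
    """Keep only the lowest `keep_ratio` fraction of frequency bins.

    hidden: (batch, seq_len, dim) real-valued hidden states.
    """
    # Real FFT along the sequence axis -> (batch, seq_len//2 + 1, dim)
    freq = torch.fft.rfft(hidden, dim=1)
    cutoff = max(1, int(freq.size(1) * keep_ratio))
    # Zero out high-frequency components (the low-pass filter, LPF)
    mask = torch.zeros_like(freq)
    mask[:, :cutoff, :] = 1
    # Inverse FFT back to the sequence domain, preserving length
    return torch.fft.irfft(freq * mask, n=hidden.size(1), dim=1)

# Example: smooth a batch of token representations
x = torch.randn(2, 64, 128)
y = fft_low_pass(x)
assert y.shape == x.shape
```

Filtering in the frequency domain like this retains slowly varying, global patterns across the sequence while suppressing high-frequency noise, which is one plausible way to emphasize multi-hop, long-range structure in protein-compound interactions.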
Anthology ID:
2024.langmol-1.14
Volume:
Proceedings of the 1st Workshop on Language + Molecules (L+M 2024)
Month:
August
Year:
2024
Address:
Bangkok, Thailand
Editors:
Carl Edwards, Qingyun Wang, Manling Li, Lawrence Zhao, Tom Hope, Heng Ji
Venues:
LangMol | WS
Publisher:
Association for Computational Linguistics
Pages:
116–127
URL:
https://aclanthology.org/2024.langmol-1.14
Cite (ACL):
Changhun Lee and Gyumin Lee. 2024. Repurformer: Transformers for Repurposing-Aware Molecule Generation. In Proceedings of the 1st Workshop on Language + Molecules (L+M 2024), pages 116–127, Bangkok, Thailand. Association for Computational Linguistics.
Cite (Informal):
Repurformer: Transformers for Repurposing-Aware Molecule Generation (Lee & Lee, LangMol-WS 2024)
PDF:
https://aclanthology.org/2024.langmol-1.14.pdf