FEDKIM: Adaptive Federated Knowledge Injection into Medical Foundation Models

Xiaochen Wang, Jiaqi Wang, Houping Xiao, Jinghui Chen, Fenglong Ma


Abstract
Foundation models have demonstrated remarkable capabilities in handling diverse modalities and tasks, outperforming conventional artificial intelligence (AI) approaches, which are highly task-specific and modality-reliant. In the medical domain, however, the development of comprehensive foundation models is constrained by limited access to diverse modalities and by stringent privacy regulations. To address these constraints, this study introduces a novel knowledge injection approach, FedKIM, designed to scale the medical foundation model within a federated learning framework. FedKIM leverages lightweight local models to extract healthcare knowledge from private data and integrates this knowledge into a centralized foundation model via a purpose-built adaptive Multitask Multimodal Mixture of Experts (M3OE) module. This method not only preserves privacy but also enhances the model’s ability to handle complex medical tasks involving multiple modalities. Our extensive experiments across twelve tasks in seven modalities demonstrate the effectiveness of FedKIM in various settings, highlighting its potential to scale medical foundation models without direct access to sensitive data. Source codes are available at https://github.com/XiaochenWang-PSU/FedKIM.
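The pipeline the abstract describes, clients training lightweight local models on private data, a server aggregating the extracted knowledge without seeing raw records, and an adaptive expert gate routing by task, can be sketched roughly as below. This is a minimal toy illustration under stated assumptions, not the paper's implementation: `local_update`, `federated_average`, and `m3oe_gate` are hypothetical names, the "local model" is a closed-form ridge fit standing in for local training, and the gate is a plain softmax over expert keys standing in for the adaptive M3OE routing.

```python
import numpy as np

rng = np.random.default_rng(0)

def local_update(data):
    """Hypothetical local step: a closed-form ridge fit on one client's
    private data, standing in for lightweight local-model training."""
    X, y = data
    return np.linalg.solve(X.T @ X + 1e-3 * np.eye(X.shape[1]), X.T @ y)

def federated_average(client_weights):
    """FedAvg-style aggregation: the server receives only model weights,
    never the clients' raw medical records."""
    return np.mean(client_weights, axis=0)

def m3oe_gate(task_embedding, expert_keys):
    """Softmax gate over experts conditioned on a task/modality embedding
    (a loose stand-in for adaptive multitask-multimodal expert routing)."""
    logits = expert_keys @ task_embedding
    e = np.exp(logits - logits.max())
    return e / e.sum()

# Toy run: 3 clients share one underlying linear "knowledge" vector.
d = 5
true_w = rng.normal(size=d)
clients = []
for _ in range(3):
    X = rng.normal(size=(20, d))
    y = X @ true_w + 0.01 * rng.normal(size=20)
    clients.append((X, y))

local = [local_update(data) for data in clients]
injected = federated_average(local)   # knowledge injected into the hub model

expert_keys = rng.normal(size=(4, 3)) # 4 experts keyed by 3-dim task embeddings
weights = m3oe_gate(rng.normal(size=3), expert_keys)
print(weights.sum())                  # gate weights form a distribution (sums to 1)
```

The design point the sketch mirrors is the separation of concerns: privacy is handled by exchanging only parameters, while multitask/multimodal capability comes from routing each task to a weighted mix of experts rather than a single shared head.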
Anthology ID:
2024.emnlp-main.464
Volume:
Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2024
Address:
Miami, Florida, USA
Editors:
Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
8141–8154
URL:
https://aclanthology.org/2024.emnlp-main.464
Cite (ACL):
Xiaochen Wang, Jiaqi Wang, Houping Xiao, Jinghui Chen, and Fenglong Ma. 2024. FEDKIM: Adaptive Federated Knowledge Injection into Medical Foundation Models. In Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing, pages 8141–8154, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal):
FEDKIM: Adaptive Federated Knowledge Injection into Medical Foundation Models (Wang et al., EMNLP 2024)
PDF:
https://aclanthology.org/2024.emnlp-main.464.pdf
Software:
 2024.emnlp-main.464.software.zip
Data:
 2024.emnlp-main.464.data.zip