Client-Customized Adaptation for Parameter-Efficient Federated Learning

Yeachan Kim, Junho Kim, Wing-Lam Mok, Jun-Hyung Park, SangKeun Lee


Abstract
Despite the versatility of pre-trained language models (PLMs) across domains, their large memory footprints pose significant challenges in federated learning (FL), where the training model has to be distributed between a server and clients. One potential solution to bypass such constraints is the use of parameter-efficient fine-tuning (PEFT) in the context of FL. However, we have observed that typical PEFT severely suffers from heterogeneity among clients in FL scenarios, resulting in unstable and slow convergence. In this paper, we propose Client-Customized Adaptation (C2A), a novel hypernetwork-based FL framework that generates client-specific adapters by conditioning on client information. By leveraging the effectiveness of hypernetworks in generating customized weights through learning to adapt to the different characteristics of inputs, C2A can maximize the utility of shared model parameters while minimizing the divergence caused by client heterogeneity. To verify the efficacy of C2A, we perform extensive evaluations on FL scenarios involving heterogeneity in label and language distributions. Comprehensive evaluation results clearly support the superiority of C2A in terms of both efficiency and effectiveness in FL scenarios.
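To make the core idea concrete, below is a minimal PyTorch sketch of a hypernetwork that maps a client embedding to the weights of a bottleneck adapter. All class names, dimensions, and the construction of the client embedding are illustrative assumptions for exposition; this is not the paper's actual C2A implementation.

import torch
import torch.nn as nn

class AdapterHypernetwork(nn.Module):
    """Sketch: generate per-client adapter weights from a client embedding.

    The hypernetwork emits the flattened down- and up-projection matrices
    of a standard bottleneck adapter. Dimensions are illustrative.
    """
    def __init__(self, client_dim: int, hidden_dim: int, bottleneck: int):
        super().__init__()
        self.hidden_dim = hidden_dim
        self.bottleneck = bottleneck
        # Produces the down-projection (hidden_dim -> bottleneck), flattened
        self.gen_down = nn.Linear(client_dim, bottleneck * hidden_dim)
        # Produces the up-projection (bottleneck -> hidden_dim), flattened
        self.gen_up = nn.Linear(client_dim, hidden_dim * bottleneck)

    def forward(self, client_emb: torch.Tensor):
        # client_emb: (client_dim,) summary of the client's local data
        # (e.g., derived from its label or language statistics)
        w_down = self.gen_down(client_emb).view(self.bottleneck, self.hidden_dim)
        w_up = self.gen_up(client_emb).view(self.hidden_dim, self.bottleneck)
        return w_down, w_up

def adapter_forward(h: torch.Tensor, w_down: torch.Tensor, w_up: torch.Tensor):
    # Bottleneck adapter with a residual connection
    return h + torch.relu(h @ w_down.T) @ w_up.T

# Usage: each client conditions the shared hypernetwork on its own embedding,
# so adapter weights are client-specific while the hypernetwork is shared.
hyper = AdapterHypernetwork(client_dim=32, hidden_dim=768, bottleneck=64)
client_emb = torch.randn(32)     # hypothetical client descriptor
w_down, w_up = hyper(client_emb)
h = torch.randn(4, 768)          # a batch of token representations
out = adapter_forward(h, w_down, w_up)
print(out.shape)                 # torch.Size([4, 768])

Under this sketch, only the hypernetwork's parameters would be communicated in the federated rounds, while each client derives its own adapter weights locally from its embedding, which is how conditioning on client information can curb divergence under heterogeneity.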
Anthology ID:
2023.findings-acl.75
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
1159–1172
URL:
https://aclanthology.org/2023.findings-acl.75
DOI:
10.18653/v1/2023.findings-acl.75
Cite (ACL):
Yeachan Kim, Junho Kim, Wing-Lam Mok, Jun-Hyung Park, and SangKeun Lee. 2023. Client-Customized Adaptation for Parameter-Efficient Federated Learning. In Findings of the Association for Computational Linguistics: ACL 2023, pages 1159–1172, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Client-Customized Adaptation for Parameter-Efficient Federated Learning (Kim et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-acl.75.pdf
Video:
https://aclanthology.org/2023.findings-acl.75.mp4