FedLFC: Towards Efficient Federated Multilingual Modeling with LoRA-based Language Family Clustering

Zhihan Guo, Yifei Zhang, Zhuo Zhang, Zenglin Xu, Irwin King


Abstract
Federated Multilingual Modeling (FMM) plays a crucial role in natural language processing applications due to the increasing diversity of languages and the growing demand for data privacy. However, FMM is limited by (1) the substantial communication costs of networked training and (2) the conflicts arising from parameter interference between different languages. To address these challenges, we introduce a communication-efficient federated learning framework with low-rank adaptation and language family clustering for Multilingual Modeling (MM). In this framework, we keep the weights of the base model frozen and update only the lightweight Low-Rank Adaptation (LoRA) parameters, minimizing communication costs. Additionally, we mitigate parameter conflicts by aggregating LoRA parameters within groups of languages that share a language family, rather than aggregating all LoRA parameters globally. Experiments demonstrate that our proposed model not only surpasses the baseline models in performance but also reduces communication overhead. Our code is available at https://github.com/zhihan-guo/FedLFC.
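
The abstract describes communicating only LoRA parameters and aggregating them per language-family cluster rather than across all clients. Below is a minimal sketch of that clustered aggregation step, assuming a FedAvg-style weighted average within each family; the cluster assignments, client names, and function signature are illustrative assumptions, not the paper's actual implementation.

# Minimal sketch (not the authors' code): per-family aggregation of client
# LoRA updates, assuming a FedAvg-style weighted average within each
# language-family cluster. Cluster assignments below are illustrative only.
from collections import defaultdict
from typing import Dict, List

import torch

# Hypothetical language-to-family mapping (assumption for illustration).
FAMILY_OF = {
    "en": "germanic", "de": "germanic",
    "es": "romance", "fr": "romance",
    "hi": "indo-aryan", "ur": "indo-aryan",
}


def aggregate_lora_by_family(
    client_lora: Dict[str, Dict[str, torch.Tensor]],  # language -> LoRA state dict
    client_sizes: Dict[str, int],                     # language -> local sample count
) -> Dict[str, Dict[str, torch.Tensor]]:
    """Average LoRA parameters separately within each language-family cluster."""
    clusters: Dict[str, List[str]] = defaultdict(list)
    for lang in client_lora:
        clusters[FAMILY_OF[lang]].append(lang)

    family_lora: Dict[str, Dict[str, torch.Tensor]] = {}
    for family, langs in clusters.items():
        total = sum(client_sizes[lang] for lang in langs)
        averaged = {}
        # Only the low-rank adapter matrices are exchanged; the frozen base
        # model weights never leave the clients or the server.
        for name in client_lora[langs[0]]:
            acc = torch.zeros_like(client_lora[langs[0]][name])
            for lang in langs:
                acc += client_lora[lang][name] * (client_sizes[lang] / total)
            averaged[name] = acc
        family_lora[family] = averaged
    return family_lora

In such a scheme, each cluster's averaged LoRA weights would be broadcast back only to the clients in that family, so updates from unrelated languages do not interfere and the per-round payload stays at LoRA size rather than full-model size.
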
Anthology ID: 2024.findings-naacl.98
Volume: Findings of the Association for Computational Linguistics: NAACL 2024
Month: June
Year: 2024
Address: Mexico City, Mexico
Editors: Kevin Duh, Helena Gomez, Steven Bethard
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 1519–1528
URL: https://aclanthology.org/2024.findings-naacl.98
Cite (ACL): Zhihan Guo, Yifei Zhang, Zhuo Zhang, Zenglin Xu, and Irwin King. 2024. FedLFC: Towards Efficient Federated Multilingual Modeling with LoRA-based Language Family Clustering. In Findings of the Association for Computational Linguistics: NAACL 2024, pages 1519–1528, Mexico City, Mexico. Association for Computational Linguistics.
Cite (Informal): FedLFC: Towards Efficient Federated Multilingual Modeling with LoRA-based Language Family Clustering (Guo et al., Findings 2024)
PDF: https://aclanthology.org/2024.findings-naacl.98.pdf
Copyright: 2024.findings-naacl.98.copyright.pdf