ActPerFL: Active Personalized Federated Learning
Huili Chen | Jie Ding | Eric Tramel | Shuang Wu | Anit Kumar Sahu | Salman Avestimehr | Tao Zhang
Proceedings of the First Workshop on Federated Learning for Natural Language Processing (FL4NLP 2022)
In the context of personalized federated learning (FL), the critical challenge is to balance local model improvement and global model tuning when the personal and global objectives may not be exactly aligned. Inspired by Bayesian hierarchical models, we develop ActPerFL, a self-aware personalized FL method where each client can automatically balance the training of its local personal model and the global model that implicitly contributes to other clients’ training. Such a balance is derived from the inter-client and intra-client uncertainty quantification. Consequently, ActPerFL can adapt to the underlying clients’ heterogeneity with uncertainty-driven local training and model aggregation. With experimental studies on Sent140 and Amazon Alexa audio data, we show that ActPerFL can achieve superior personalization performance compared with the existing counterparts.
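The abstract describes balancing local and global training via inter-client and intra-client uncertainty. A minimal sketch of one plausible reading of that idea is below: precision-weighted (inverse-variance) aggregation across clients, and a personalization step that mixes the local and global models by their relative uncertainty. The function names and the precision-weighting scheme are illustrative assumptions for intuition, not the paper's exact algorithm.

```python
import numpy as np

def aggregate(client_params, client_vars):
    """Precision-weighted average of client parameter vectors.

    Clients with lower intra-client variance (higher certainty)
    receive proportionally more weight in the global model.
    """
    prec = 1.0 / np.asarray(client_vars, dtype=float)
    weights = prec / prec.sum()
    return sum(w * p for w, p in zip(weights, client_params))

def personalize(local_params, global_params, local_var, global_var):
    """Mix local and global models by relative uncertainty.

    The more certain source (smaller variance) dominates the blend,
    so each client automatically balances personal vs. global training.
    """
    alpha = (1.0 / local_var) / (1.0 / local_var + 1.0 / global_var)
    return alpha * local_params + (1.0 - alpha) * global_params
```

With equal client variances, `aggregate` reduces to plain FedAvg, and `personalize` to an even blend; heterogeneous variances shift the balance toward the more reliable side, which is the adaptive behavior the abstract attributes to ActPerFL.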