Paper ID | MLSP-13.3
Paper Title | OPTIMAL IMPORTANCE SAMPLING FOR FEDERATED LEARNING
Authors | Elsa Rizk, Stefan Vlaski, Ali H. Sayed, École Polytechnique Fédérale de Lausanne (EPFL), Switzerland
Session | MLSP-13: Federated Learning 2
Location | Gather.Town
Session Time | Wednesday, 09 June, 13:00 - 13:45
Presentation Time | Wednesday, 09 June, 13:00 - 13:45
Presentation | Poster
Topic | Machine Learning for Signal Processing: [MLR-DFED] Distributed/Federated learning
Abstract |
Federated learning involves a mixture of centralized and decentralized processing tasks: a server regularly selects a sample of the agents, and these in turn sample their local data to compute stochastic gradients for their learning updates. The sampling of both agents and data is generally uniform; in this work, however, we consider non-uniform sampling. We derive optimal importance sampling strategies for both agent and data selection and show that, under convexity and Lipschitz assumptions, non-uniform sampling without replacement improves the performance of the original FedAvg algorithm. We run experiments on regression and classification problems to illustrate the theoretical results.
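The two-level sampling structure described in the abstract can be sketched as follows. This is an illustrative toy sketch, not the paper's algorithm: the paper derives the *optimal* importance weights analytically, whereas here the weights, the `importance_sample` helper, and the agent interface are all hypothetical placeholders chosen only to show where non-uniform agent selection enters a FedAvg-style round.

```python
import numpy as np

def importance_sample(weights, k, rng):
    """Draw k distinct indices with probability proportional to the
    given importance weights (hypothetical helper; the optimal
    weights are derived in the paper, not computed here)."""
    p = np.asarray(weights, dtype=float)
    p = p / p.sum()
    return rng.choice(len(p), size=k, replace=False, p=p)

def fedavg_round(server_w, agents, k, rng):
    """One FedAvg-style round with non-uniform agent sampling.

    `agents` is a list of (importance, local_update_fn) pairs, where
    local_update_fn maps the current server model to a locally
    updated model. The server averages the returned models.
    """
    chosen = importance_sample([imp for imp, _ in agents], k, rng)
    updates = [agents[i][1](server_w.copy()) for i in chosen]
    return np.mean(updates, axis=0)

# Toy usage: two agents whose local steps nudge the model in
# opposite directions; the first is sampled more often.
rng = np.random.default_rng(0)
agents = [(3.0, lambda w: w + 0.1), (1.0, lambda w: w - 0.1)]
w = fedavg_round(np.zeros(4), agents, k=1, rng=rng)
```

Data-level importance sampling would live inside each `local_update_fn`, weighting the local mini-batch draw the same way the server weights agents; sampling without replacement at both levels is the regime analyzed in the paper.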