Paper ID | MLSP-13.2
Paper Title | GRADUAL FEDERATED LEARNING USING SIMULATED ANNEALING
Authors | Luong Trung Nguyen, Byonghyo Shim, Seoul National University, South Korea
Session | MLSP-13: Federated Learning 2
Location | Gather.Town
Session Time | Wednesday, 09 June, 13:00 - 13:45
Presentation Time | Wednesday, 09 June, 13:00 - 13:45
Presentation | Poster
Topic | Machine Learning for Signal Processing: [MLR-DFED] Distributed/Federated learning
Abstract |
Federated learning is a machine learning framework that enables the training of AI models over a network of user devices without revealing the data stored on those devices. A widely used technique for enhancing the learning performance of user devices is to evaluate a global model at the server by averaging the locally trained models of the devices. This global model is then sent back to the user devices so that every device applies it in the next training iteration. However, this average-based global model is not always better than a device's own local update. In this work, we put forth a new update strategy based on the simulated annealing (SA) algorithm, in which each user device chooses its training parameters probabilistically between the global evaluation model and its local model. The proposed technique, dubbed simulated annealing-based federated learning (SAFL), is effective in solving a wide class of federated learning problems. From numerical experiments, we demonstrate that SAFL outperforms the conventional approach on different benchmark datasets, achieving an accuracy improvement of 50% in a few iterations.
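To make the abstract's update rule concrete, the following Python sketch shows one plausible SA-style device update: the device accepts the server's averaged model when it is at least as good on local data, and otherwise accepts it only with a probability that decays with a temperature schedule. The function names, the Metropolis-style acceptance test, and the geometric cooling schedule are illustrative assumptions, not the authors' exact formulation.

    import math
    import random

    def sa_model_update(local_params, global_params, local_loss_fn, temperature):
        # Hypothetical SA-based choice between the server's averaged model
        # and the device's own local model (names and rule are illustrative).
        loss_global = local_loss_fn(global_params)
        loss_local = local_loss_fn(local_params)
        if loss_global <= loss_local:
            # The averaged model is at least as good on local data: accept it.
            return global_params
        # Otherwise accept the locally worse global model only with a
        # probability that shrinks as the temperature cools.
        accept_prob = math.exp(-(loss_global - loss_local) / temperature)
        return global_params if random.random() < accept_prob else local_params

    def temperature_at(round_idx, t0=1.0, decay=0.95):
        # Illustrative geometric cooling schedule over communication rounds.
        return t0 * (decay ** round_idx)

Under this kind of schedule, early rounds behave like conventional averaging-based federated learning (global models are usually accepted), while later rounds increasingly favor each device's local model when the global average is locally worse.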