Sine Cosine Algorithm for Reducing Communication Costs of Federated Learning
Document Type
Conference Proceeding
Publication Title
2022 IEEE International Mediterranean Conference on Communications and Networking, MeditCom 2022
Abstract
Federated Learning (FL) is a Machine Learning (ML) setting in which several clients (e.g., mobile devices) train a model cooperatively under the direction of a central server (e.g., a cloud server) while the training data remains decentralized. Because FL clients frequently have restricted transmission capacity, communication between clients and the server must be reduced to improve performance. FL clients frequently employ Wi-Fi and must interact in Unstable Network Environments (UNEs). Existing FL aggregation techniques send and receive a huge number of weights, which dramatically reduces accuracy in a UNE. In this paper, we propose a Federated Sine Cosine Algorithm (FedSCA) that reduces data communication by transferring score values rather than all the client models' weights, and that uses the Sine Cosine Algorithm (SCA) as a weight-updating technique to improve the clients' models. Experiments show that FedSCA significantly decreases the amount of data transferred over the network and increases the global model's accuracy by an average of 9.87% over FedAvg and 2.29% over Federated Particle Swarm Optimization (FedPSO). Moreover, in experiments conducted on an unstable network, FedSCA showed a 4.3% improvement over existing algorithms with respect to accuracy loss.
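For context, the SCA weight-updating mechanism the abstract refers to is built on the position-update rule of the Sine Cosine Algorithm (Mirjalili, 2016). The following is a minimal Python sketch of that canonical update only; the function name sca_update, its parameters, and the NumPy vectorization are illustrative assumptions and do not reproduce the paper's actual FedSCA implementation.

```python
import numpy as np

def sca_update(positions, best, t, T, a=2.0, rng=None):
    """One canonical SCA position update (Mirjalili, 2016) -- illustrative sketch.

    positions: (n_agents, dim) candidate solutions (e.g., flattened model weights)
    best:      (dim,) best solution found so far (the destination point P)
    t, T:      current iteration and total number of iterations
    a:         initial amplitude; r1 decays linearly from a to 0 over T iterations
    """
    rng = np.random.default_rng() if rng is None else rng
    r1 = a - t * (a / T)                              # shrinking step amplitude
    r2 = rng.uniform(0.0, 2.0 * np.pi, positions.shape)  # random phase
    r3 = rng.uniform(0.0, 2.0, positions.shape)          # random weight on best
    r4 = rng.uniform(0.0, 1.0, positions.shape)          # sine/cosine switch
    distance = np.abs(r3 * best - positions)             # distance to destination
    step = np.where(r4 < 0.5, np.sin(r2), np.cos(r2))    # choose sine or cosine branch
    return positions + r1 * step * distance
```

In this sketch, the oscillation between the sine and cosine branches balances exploration and exploitation, while the decaying r1 amplitude lets the candidate solutions converge toward the best position over the course of training.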
First Page
55
Last Page
60
DOI
10.1109/MeditCom55741.2022.9928614
Publication Date
11-2-2022
Keywords
Convolutional Neural Network (CNN), Deep Learning, Federated Learning, Optimization, Sine Cosine Algorithm (SCA)
Recommended Citation
A. K. Abasi, M. Aloqaily, M. Guizani and F. Karray, "Sine Cosine Algorithm for Reducing Communication Costs of Federated Learning," 2022 IEEE International Mediterranean Conference on Communications and Networking (MeditCom), 2022, pp. 55-60, doi: 10.1109/MeditCom55741.2022.9928614.