A High-Performance Federated Learning Aggregation Algorithm Based on Learning Rate Adjustment and Client Sampling

Authors: Yulian Gao, Gehao Lu, Jimei Gao, Jinggang Li
Language: English
Year of publication: 2023
Source: Mathematics, Vol 11, Iss 20, p 4344 (2023)
Document type: article
ISSN: 2227-7390
DOI: 10.3390/math11204344
Description: Federated learning is a distributed learning framework designed to protect user privacy, widely applied across various domains. However, existing federated learning algorithms face challenges, including slow convergence, significant loss fluctuations during aggregation, and imbalanced client sampling. To address these issues, this paper introduces a high-performance federated learning aggregation algorithm that combines a cyclic adaptive learning rate adjustment strategy with client-weighted random sampling. Weighted random sampling assigns client weights based on their sampling frequency, balancing client sampling rates and contributions to enhance model aggregation. Additionally, the algorithm adapts the learning rate based on client loss variations and communication rounds, accelerating model convergence and reducing communication costs. To evaluate this algorithm, experiments are conducted on the well-known MNIST and CIFAR-10 datasets. The results demonstrate significant improvements in convergence speed and loss stability. Compared to traditional federated learning algorithms, the approach achieves faster and more stable convergence while effectively reducing training costs.
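The two mechanisms described in the abstract can be illustrated with a minimal sketch. The function names, the inverse-frequency weighting, and the cosine-shaped cyclic schedule below are illustrative assumptions, not the paper's actual formulas: the abstract only states that weights depend on sampling frequency and that the learning rate adapts to loss variations and communication rounds.

```python
import math
import random


def sample_clients(sample_counts, k, rng=random):
    """Pick k distinct clients, favoring those sampled less often so far.

    sample_counts: dict mapping client id -> times sampled previously.
    Weighting is inverse to past frequency (an illustrative choice,
    not necessarily the paper's exact scheme).
    """
    clients = list(sample_counts)
    weights = [1.0 / (sample_counts[c] + 1.0) for c in clients]
    selected = []
    for _ in range(k):
        # Weighted draw without replacement.
        total = sum(weights)
        r = rng.random() * total
        acc = 0.0
        for i, w in enumerate(weights):
            acc += w
            if r <= acc:
                selected.append(clients.pop(i))
                weights.pop(i)
                break
    for c in selected:
        sample_counts[c] += 1  # record participation for future rounds
    return selected


def cyclic_adaptive_lr(base_lr, round_idx, cycle_len, loss_prev, loss_curr):
    """Cyclic learning rate modulated by the loss trend.

    A cosine cycle over communication rounds (a hypothetical stand-in for
    the paper's schedule), halved when the aggregated loss increased.
    """
    phase = (round_idx % cycle_len) / cycle_len
    lr = base_lr * 0.5 * (1.0 + math.cos(math.pi * phase))
    if loss_prev is not None and loss_curr > loss_prev:
        lr *= 0.5  # damp the step after a loss spike
    return lr
```

In a server loop, each communication round would call `sample_clients` to choose participants, broadcast the rate from `cyclic_adaptive_lr`, and aggregate the returned updates; frequently chosen clients see their weight shrink, which balances participation over time.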
Database: Directory of Open Access Journals