Author:
Huang, Jianjun; Rui, Zihao; Kang, Li
Subject:
Source:
Complex & Intelligent Systems; Apr 2024, Vol. 10 Issue 2, p2499-2514, 16p
Abstract:
Federated learning (FL) is a promising distributed machine learning paradigm for addressing data isolation arising from data privacy concerns. Nevertheless, most vanilla FL algorithms depend on a central server and, in practice, suffer from reliability issues and a high communication burden. Decentralized federated learning (DFL), which abandons the star topology, instead faces the challenges of weight divergence and poor communication efficiency. In this paper, a novel DFL framework called federated incremental subgradient-proximal (FedISP) is proposed that uses incremental model updates to alleviate weight divergence. In our setup, multiple clients are arranged in a ring topology and communicate cyclically, which significantly reduces the communication load. A convergence guarantee is given under convexity assumptions, demonstrating the impact of the learning rate on our algorithms and further improving the performance of FedISP. Extensive experiments on benchmark datasets validate the effectiveness of the proposed approach in both independent and identically distributed (IID) and non-IID settings, while illustrating the advantages of FedISP in achieving model consensus and saving communication costs. [ABSTRACT FROM AUTHOR]
Database:
Complementary Index |
External link: