On the Convergence of Federated Averaging under Partial Participation for Over-parameterized Neural Networks

Authors: Liu, Xin, Li, Wei, Zhan, Dazhi, Pan, Yu, Ma, Xin, Ding, Yu, Pan, Zhisong
Publication year: 2023
Document type: Working Paper
Description: Federated learning (FL) is a widely employed distributed paradigm for collaboratively training machine learning models across multiple clients without sharing local data. In practice, FL must cope with partial client participation caused by limited bandwidth, intermittent connections, and strict synchronization delays. At the same time, few theoretical convergence guarantees exist in this practical setting, especially for the non-convex optimization of neural networks. To bridge this gap, we study the training problem of the federated averaging (FedAvg) method for two canonical models: a deep linear network and a two-layer ReLU network. Under the over-parameterization assumption, we provably show that FedAvg converges to a global minimum at a linear rate $\mathcal{O}\left(\left(1-\frac{\min_{i \in [t]}|S_i|}{N^2}\right)^t\right)$ after $t$ iterations, where $N$ is the number of clients and $|S_i|$ is the number of clients participating in the $i$-th iteration. Experimental evaluations confirm our theoretical results.
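The FedAvg procedure with partial participation described above can be sketched as follows. This is a minimal illustrative simulation, not the paper's implementation: the function name `fedavg_partial`, the least-squares local losses, and the uniform sampling of participating clients are all assumptions made for the example.

```python
import numpy as np

def fedavg_partial(client_data, rounds=50, local_steps=5, lr=0.05,
                   participation=0.5, seed=0):
    """Sketch of FedAvg with partial client participation.

    Each client i holds (A_i, b_i) and locally minimizes ||A_i w - b_i||^2.
    In every round, only a random subset S_i of clients trains locally;
    the server averages the models returned by that subset.
    (Hypothetical setup for illustration; the paper studies deep linear
    and two-layer ReLU networks rather than least squares.)
    """
    rng = np.random.default_rng(seed)
    n = len(client_data)
    dim = client_data[0][0].shape[1]
    w = np.zeros(dim)  # global model
    for _ in range(rounds):
        k = max(1, int(participation * n))
        S = rng.choice(n, size=k, replace=False)  # participating clients S_i
        local_models = []
        for i in S:
            A, b = client_data[i]
            w_i = w.copy()
            for _ in range(local_steps):  # local gradient descent
                grad = 2 * A.T @ (A @ w_i - b) / len(b)
                w_i -= lr * grad
            local_models.append(w_i)
        w = np.mean(local_models, axis=0)  # aggregate over S_i only
    return w
```

Because only $|S_i|$ of the $N$ clients contribute each round, the per-round contraction factor in the paper's rate degrades as participation shrinks, which this toy setup lets one observe empirically by varying `participation`.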
Comment: The partial participation setting may pose difficulties in deriving the convergence guarantee
Database: arXiv