Showing 1 - 5 of 5 for search: '"Gruntkowska, Kaja"'
We revisit FedExProx - a recently proposed distributed optimization method designed to enhance convergence properties of parallel proximal algorithms via extrapolation. In the process, we uncover a surprising flaw: its known theoretical guarantees on…
External link:
http://arxiv.org/abs/2410.15368
In practical distributed systems, workers are typically not homogeneous, and due to differences in hardware configurations and network conditions, can have highly varying processing times. We consider smooth nonconvex finite-sum (empirical risk minim…
External link:
http://arxiv.org/abs/2405.15545
Effective communication between the server and workers plays a key role in distributed optimization. In this paper, we focus on optimizing the server-to-worker communication, uncovering inefficiencies in prevalent downlink compression approaches. Con…
External link:
http://arxiv.org/abs/2402.06412
Communication Compression for Byzantine Robust Learning: New Efficient Algorithms and Improved Rates
Byzantine robustness is an essential feature of algorithms for certain distributed optimization problems, typically encountered in collaborative/federated learning. These problems are usually huge-scale, implying that communication compression is als…
External link:
http://arxiv.org/abs/2310.09804
Published in:
Proceedings of the 40th International Conference on Machine Learning, Honolulu, Hawaii, USA. PMLR 202, 2023
In this work, we focus our attention on distributed optimization problems in the context where the communication time between the server and the workers is non-negligible. We obtain novel methods supporting bidirectional compression (both from the ser…
External link:
http://arxiv.org/abs/2209.15218