A General Framework for Distributed Partitioned Optimization

Author: Chezhegov, Savelii, Novitskii, Anton, Rogozin, Alexander, Parsegov, Sergei, Dvurechensky, Pavel, Gasnikov, Alexander
Publication year: 2022
Subject:
Document type: Working Paper
Description: Decentralized optimization is widely used in large-scale and privacy-preserving machine learning as well as in various distributed control and sensing systems. It is assumed that every agent in the network possesses a local objective function, and the nodes interact via a communication network. In the standard scenario, which is mostly studied in the literature, the local functions depend on a common set of variables, and therefore the nodes have to exchange the whole variable set at each communication round. In this work, we study a different problem statement, where each local function held by a node depends only on a subset of the variables. Given a network, we build a general algorithm-independent framework for decentralized partitioned optimization that allows one to construct algorithms with reduced communication load using a generalization of the Laplacian matrix. Moreover, our framework yields algorithms with non-asymptotic convergence rates that depend explicitly on the parameters of the network, including accelerated and optimal first-order methods. We illustrate the efficacy of our approach on a synthetic example.
Database: arXiv
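
The abstract's key idea is to couple, across each communication link, only the variables that both endpoints actually use, via a generalization of the Laplacian matrix. The snippet below is a minimal illustrative sketch of that idea, not the paper's construction: the three-node network, the variable subsets `var_sets`, and the edge list `edges` are assumptions made up for the example.

```python
# Hedged sketch: a block "partitioned" Laplacian that penalizes disagreement
# only on variables shared between neighboring nodes. The graph, variable
# subsets, and names below are illustrative assumptions, not the paper's setup.
import numpy as np

n_vars = 3                                     # global variables x_1, x_2, x_3
var_sets = {0: {0, 1}, 1: {1, 2}, 2: {0, 2}}   # variables each node's f_i depends on
edges = [(0, 1), (1, 2), (0, 2)]               # communication graph

n_nodes = len(var_sets)
dim = n_nodes * n_vars                         # stacked variable z = (x^(0), x^(1), x^(2))
W = np.zeros((dim, dim))

for (i, j) in edges:
    for k in var_sets[i] & var_sets[j]:        # couple only the shared coordinates
        a = i * n_vars + k
        b = j * n_vars + k
        W[a, a] += 1.0
        W[b, b] += 1.0
        W[a, b] -= 1.0
        W[b, a] -= 1.0

# W is positive semidefinite, and z^T W z sums the squared differences of
# shared coordinates over the edges, so it vanishes whenever neighboring
# copies agree on the variables they share.
z = np.tile(np.arange(n_vars, dtype=float), n_nodes)   # identical local copies
print(np.allclose(W @ z, 0.0))                 # True: consensus on shared variables
```

Under this kind of construction, a communication round only needs to exchange the coordinates that appear in some shared subset, which is where the reduced communication load described in the abstract comes from.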