Showing 1 - 10 of 51 results for the search: '"Nassif, Roula"'
Communication-constrained algorithms for decentralized learning and optimization rely on local updates coupled with the exchange of compressed signals. In this context, differential quantization is an effective technique to mitigate the negative impact … (An illustrative code sketch of this idea follows the link below.)
External link: http://arxiv.org/abs/2406.18418
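A minimal sketch of decentralized averaging with differential quantization, assuming a ring topology, a uniform rounding quantizer, and hand-picked step sizes; these choices are illustrative assumptions, not the algorithm from the paper above. The point being illustrated is that only the quantized difference between an agent's current iterate and its last published reconstruction is transmitted, so the reconstructions track the true iterates while the communicated payload stays coarse.

```python
import numpy as np

rng = np.random.default_rng(0)
N, d = 5, 3                          # agents, parameter dimension (assumed)
A = np.zeros((N, N))                 # doubly stochastic ring combination weights
for k in range(N):
    A[k, k] = 0.5
    A[k, (k + 1) % N] = 0.25
    A[k, (k - 1) % N] = 0.25

def quantize(x, step=0.05):
    """Uniform rounding quantizer applied to the *difference* signal."""
    return step * np.round(x / step)

w = rng.normal(size=(N, d))          # local iterates
w_hat = np.zeros((N, d))             # reconstructions shared with neighbors

for _ in range(300):
    q = quantize(w - w_hat)          # compress only the innovation
    w_hat = w_hat + q                # everyone updates the shared reconstruction
    w = w + 0.5 * (A @ w_hat - w)    # mixing step uses compressed copies only

print("disagreement after mixing:", np.max(np.abs(w - w.mean(axis=0))))
```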
Classical paradigms for distributed learning, such as federated or decentralized gradient descent, employ consensus mechanisms to enforce homogeneity among agents. While these strategies have proven effective in i.i.d. scenarios, they can result in … (A short sketch of consensus-based decentralized gradient descent follows the link below.)
External link: http://arxiv.org/abs/2304.07358
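For reference, a short sketch of the classical consensus-based decentralized gradient descent mentioned above, under assumed quadratic local costs that share a common minimizer (an i.i.d.-like setting) and an assumed ring topology; it is not the method proposed in the paper.

```python
import numpy as np

rng = np.random.default_rng(4)
N, d, mu = 5, 3, 0.05                # agents, dimension, step size (assumed)
A = np.zeros((N, N))                 # doubly stochastic ring weights
for k in range(N):
    A[k, k], A[k, (k + 1) % N], A[k, (k - 1) % N] = 0.5, 0.25, 0.25

# local costs J_k(w) = 0.5 * ||H_k w - b_k||^2, all minimized at the same w_star
H = rng.normal(size=(N, d, d))
w_star = rng.normal(size=d)
b = np.stack([H[k] @ w_star for k in range(N)])

w = np.zeros((N, d))
for _ in range(3000):
    grads = np.stack([H[k].T @ (H[k] @ w[k] - b[k]) for k in range(N)])
    w = A @ w - mu * grads           # consensus averaging + local gradient step

print("max deviation from the common minimizer:",
      np.linalg.norm(w - w_star, axis=1).max())
```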
Author: Nassif, Roula, Vlaski, Stefan, Carpentiero, Marco, Matta, Vincenzo, Antonini, Marc, Sayed, Ali H.
In this paper, we consider decentralized optimization problems where agents have individual cost functions to minimize subject to subspace constraints that require the minimizers across the network to lie in low-dimensional subspaces. This constrained … (An illustrative sketch of gradient steps combined with subspace projections follows the link below.)
External link: http://arxiv.org/abs/2209.07821
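A centralized, illustrative sketch of the "take a gradient step, then project onto the low-dimensional subspace" idea behind subspace-constrained optimization; the quadratic costs, the subspace basis U, and the step size are assumptions, and the papers listed here develop decentralized counterparts rather than this centralized loop.

```python
import numpy as np

rng = np.random.default_rng(1)
N, d, r = 4, 2, 3                        # agents, per-agent dim, subspace dim
M = N * d                                # network-wide dimension
U, _ = np.linalg.qr(rng.normal(size=(M, r)))   # orthonormal basis, constraint w in Range(U)
P = U @ U.T                              # orthogonal projector onto the subspace

# individual quadratic costs J_k(w_k) = 0.5 * ||H_k w_k - b_k||^2 (assumed)
H = rng.normal(size=(N, d, d))
b = rng.normal(size=(N, d))

w = np.zeros(M)
mu = 0.05
for _ in range(2000):
    grad = np.concatenate([H[k].T @ (H[k] @ w[k*d:(k+1)*d] - b[k]) for k in range(N)])
    w = P @ (w - mu * grad)              # adapt, then project onto the subspace

print("constraint residual:", np.linalg.norm(w - P @ w))
print("projected gradient norm:", np.linalg.norm(P @ grad))
```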
Observations collected by agents in a network may be unreliable due to observation noise or interference. This paper proposes a distributed algorithm that allows each node to improve the reliability of its own observation by relying solely on local …
External link: http://arxiv.org/abs/2203.09810
We study the problem of distributed estimation over adaptive networks where communication delays exist between nodes. In particular, we investigate the diffusion Least-Mean-Square (LMS) strategy where delayed intermediate estimates (due to the communication …). A brief sketch of diffusion LMS with a delayed combination step follows the link below.
External link: http://arxiv.org/abs/2004.08881
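A brief sketch of adapt-then-combine (ATC) diffusion LMS in which each agent combines its own current intermediate estimate with neighbors' estimates delayed by one iteration. The ring topology, the unit delay, the Gaussian data model, and the step size are illustrative assumptions, not the setting analyzed in the paper.

```python
import numpy as np

rng = np.random.default_rng(2)
N, d, mu = 6, 4, 0.02                    # agents, dimension, step size (assumed)
w_true = rng.normal(size=d)              # common parameter vector to estimate

A = np.zeros((N, N))                     # doubly stochastic ring weights
for k in range(N):
    A[k, k], A[k, (k + 1) % N], A[k, (k - 1) % N] = 0.5, 0.25, 0.25

w = np.zeros((N, d))                     # current estimates
psi_prev = np.zeros((N, d))              # neighbors' delayed intermediate estimates

for _ in range(4000):
    u = rng.normal(size=(N, d))                          # streaming regressors
    dmeas = u @ w_true + 0.01 * rng.normal(size=N)       # noisy measurements
    # adapt: local LMS step on the streaming data
    psi = w + mu * (dmeas - np.sum(u * w, axis=1))[:, None] * u
    # combine: own current estimate plus one-step-delayed neighbor estimates
    w = np.diag(A)[:, None] * psi + (A - np.diag(np.diag(A))) @ psi_prev
    psi_prev = psi

print("network mean-square deviation:", np.mean(np.sum((w - w_true) ** 2, axis=1)))
```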
The problem of learning simultaneously several related tasks has received considerable attention in several domains, especially in machine learning with the so-called multitask learning problem or learning to learn problem [1], [2]. Multitask learning …
External link: http://arxiv.org/abs/2001.02112
In this work, we are interested in adaptive and distributed estimation of graph filters from streaming data. We formulate this problem as a consensus estimation problem over graphs, which can be addressed with diffusion LMS strategies. Most popular … (An illustrative sketch of estimating shared graph-filter coefficients with a diffusion-LMS-style update follows the link below.)
External link: http://arxiv.org/abs/1912.05805
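An illustrative sketch of estimating shared graph-filter coefficients from streaming graph signals with a diffusion-LMS-style adapt-then-combine update. The random graph-shift operator, the filter order, the separate ring used for combination, and the step size are assumptions of this sketch, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(3)
N, M_taps, mu = 10, 3, 0.05                 # nodes, filter order, step size (assumed)
S = (rng.random((N, N)) < 0.3).astype(float)
S = np.triu(S, 1)
S = S + S.T                                 # symmetric adjacency as graph shift
S = S / (np.abs(np.linalg.eigvalsh(S)).max() + 1e-9)   # normalize the spectrum
h_true = np.array([1.0, 0.5, -0.25])        # unknown shared filter taps (assumed)

A = np.zeros((N, N))                        # ring combination weights (assumed)
for k in range(N):
    A[k, k], A[k, (k + 1) % N], A[k, (k - 1) % N] = 0.5, 0.25, 0.25

h = np.zeros((N, M_taps))                   # per-node estimates of the taps
for _ in range(5000):
    x = rng.normal(size=N)                  # streaming excitation signal
    Z = np.stack([np.linalg.matrix_power(S, m) @ x for m in range(M_taps)], axis=1)
    y = Z @ h_true + 0.01 * rng.normal(size=N)   # node-wise filter outputs
    # adapt: node k regresses its own output y[k] on its local regressor Z[k]
    psi = h + mu * (y - np.sum(Z * h, axis=1))[:, None] * Z
    # combine: diffuse the intermediate estimates over the network
    h = A @ psi

print("filter-tap error:", np.linalg.norm(h.mean(axis=0) - h_true))
```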
This work introduces two strategies for training network classifiers with heterogeneous agents. One strategy promotes global smoothing over the graph and a second strategy promotes local smoothing over neighbourhoods. It is assumed that the feature … (A small sketch of graph-Laplacian smoothing of per-agent classifier parameters follows the link below.)
External link: http://arxiv.org/abs/1911.04870
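A small sketch of one way to promote smoothness of per-agent classifier parameters over a graph, by adding a graph-Laplacian penalty to local logistic-regression updates. The random graph, the penalty weight eta, the label model, and the step size are illustrative assumptions; the paper's two specific smoothing strategies are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(5)
N, d, mu, eta = 6, 3, 0.1, 0.5               # agents, dim, step size, penalty (assumed)
A = (rng.random((N, N)) < 0.4).astype(float)
A = np.triu(A, 1)
A = A + A.T                                  # undirected graph adjacency
L = np.diag(A.sum(axis=1)) - A               # graph Laplacian

w_true = rng.normal(size=d)                  # shared underlying classifier (assumed)
W = np.zeros((N, d))                         # per-agent classifier parameters

for _ in range(3000):
    x = rng.normal(size=(N, d))                              # one feature vector per agent
    y = (x @ w_true + 0.5 * rng.normal(size=N) > 0).astype(float)   # noisy binary labels
    p = 1.0 / (1.0 + np.exp(-np.sum(x * W, axis=1)))         # per-agent predictions
    grad_local = (p - y)[:, None] * x                        # logistic-loss gradients
    grad_smooth = L @ W                                      # graph-smoothing term
    W = W - mu * (grad_local + eta * grad_smooth)

print("parameter spread across agents:", np.linalg.norm(W - W.mean(axis=0)))
```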
Part I of this paper considered optimization problems over networks where agents have individual objectives to meet, or individual parameter vectors to estimate, subject to subspace constraints that require the objectives across the network to lie in low-dimensional subspaces …
External link: http://arxiv.org/abs/1906.12250
This paper considers optimization problems over networks where agents have individual objectives to meet, or individual parameter vectors to estimate, subject to subspace constraints that require the objectives across the network to lie in low-dimensional subspaces …
External link: http://arxiv.org/abs/1905.08750