Showing 1 - 10 of 154 for the search: '"Choi, Woocheol"'
The gradient-push algorithm is a fundamental algorithm for the distributed optimization problem \begin{equation} \min_{x \in \mathbb{R}^d} f(x) = \sum_{j=1}^n f_j (x), \end{equation} where each local cost $f_j$ is known only to agent $a_j$ for $1 \le$ …
External link:
http://arxiv.org/abs/2407.13564
In this work, we establish the linear convergence estimate for the gradient descent involving the delay $\tau\in\mathbb{N}$ when the cost function is $\mu$-strongly convex and $L$-smooth. This result improves upon the well-known estimates in Arjevani …
External link:
http://arxiv.org/abs/2308.11984
Author:
Choi, Woocheol, Lee, Myeong-Su
In this paper, we establish new convergence results for the quantized distributed gradient descent and suggest a novel strategy for choosing the stepsizes for high performance of the algorithm. Under the strong convexity assumption on the aggreg…
External link:
http://arxiv.org/abs/2306.17481
Author:
Choi, Woocheol
In this work, we establish convergence results for the distributed proximal point algorithm (DPPA) for distributed optimization problems. We consider the problem on the whole domain $\mathbb{R}^d$ and find a general condition on the stepsize and cost functions s…
External link:
http://arxiv.org/abs/2305.17383
In this paper, we consider the online proximal mirror descent for solving time-varying composite optimization problems. In various applications, the algorithm naturally involves errors in the gradient and the proximal operator. We obtain sharp e…
External link:
http://arxiv.org/abs/2304.04710
Author:
Choi, Woocheol, Kim, Jimyeong
In this work, we are concerned with the decentralized optimization problem: \begin{equation*} \min_{x \in \Omega}~f(x) = \frac{1}{n} \sum_{i=1}^n f_i (x), \end{equation*} where $\Omega \subset \mathbb{R}^d$ is a convex domain and each $f_i : \Omega$ …
External link:
http://arxiv.org/abs/2303.08412
Author:
Choi, Woocheol
In this paper, we consider the decentralized gradient descent (DGD) given by \begin{equation*} x_i (t+1) = \sum_{j=1}^m w_{ij} x_j (t) - \alpha (t) \nabla f_i (x_i (t)). \end{equation*} We find a sharp range of the stepsize $\alpha (t)>0$ such that t…
External link:
http://arxiv.org/abs/2303.05755
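The DGD update rule quoted in the abstract above can be sketched on a toy problem. The 3-agent doubly stochastic weight matrix, the quadratic local costs $f_i(x) = \frac{1}{2}(x-b_i)^2$, and the diminishing stepsize $\alpha(t) = 1/(t+2)$ below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Sketch of the DGD iteration
#   x_i(t+1) = sum_j w_ij x_j(t) - alpha(t) * grad f_i(x_i(t))
# on a toy problem with assumed quadratic costs and a 3-agent network.

def dgd(W, grads, x0, alpha, T):
    """Run T DGD iterations; alpha is a function t -> stepsize."""
    x = x0.astype(float).copy()
    for t in range(T):
        mixed = W @ x                                   # consensus (mixing) step
        g = np.array([grads[i](x[i]) for i in range(len(x))])
        x = mixed - alpha(t) * g                        # local gradient step
    return x

# Doubly stochastic mixing matrix for a 3-agent network.
W = np.array([[0.50, 0.25, 0.25],
              [0.25, 0.50, 0.25],
              [0.25, 0.25, 0.50]])
b = np.array([1.0, 2.0, 3.0])
grads = [lambda x, bi=bi: x - bi for bi in b]           # grad of 0.5*(x - b_i)^2

# Diminishing stepsize alpha(t) = 1/(t+2); all agents approach the
# global minimizer mean(b) = 2.0 of f = (1/n) * sum_i f_i.
x = dgd(W, grads, np.zeros(3), lambda t: 1.0 / (t + 2), 2000)
print(x)
```

With a diminishing stepsize satisfying $\sum_t \alpha(t) = \infty$, the iterates reach consensus and converge to the minimizer of the average cost; with a constant stepsize they would only reach a neighborhood of it.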
The gradient-push algorithm has been widely used for decentralized optimization problems when the connectivity network is a directed graph. This paper shows that the gradient-push algorithm with stepsize $\alpha>0$ converges exponentially fast to an $O(\alpha)$…
External link:
http://arxiv.org/abs/2302.08779
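The standard gradient-push (push-sum) iteration for directed graphs, the setting of the abstract above, can be sketched as follows. The column-stochastic matrix `C`, the quadratic costs $f_i(z) = \frac{1}{2}(z-b_i)^2$, and the constant stepsize `alpha` are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Sketch of the standard gradient-push (push-sum) iteration on a
# strongly connected directed graph, with assumed quadratic costs.

def gradient_push(C, grads, x0, alpha, T):
    n = len(x0)
    x = x0.astype(float).copy()   # numerator states
    y = np.ones(n)                # push-sum weights
    for _ in range(T):
        w = C @ x                 # push numerators along out-edges
        y = C @ y                 # push weights the same way
        z = w / y                 # de-biased local estimates
        x = w - alpha * np.array([grads[i](z[i]) for i in range(n)])
    return x / y

# Column-stochastic mixing matrix of a strongly connected directed graph
# (columns sum to 1; the graph is NOT balanced, so push-sum matters).
C = np.array([[0.5, 0.0, 1/3],
              [0.5, 0.5, 1/3],
              [0.0, 0.5, 1/3]])
b = np.array([0.0, 3.0, 6.0])
grads = [lambda z, bi=bi: z - bi for bi in b]  # grad of 0.5*(z - b_i)^2

z = gradient_push(C, grads, np.zeros(3), alpha=0.05, T=3000)
print(z)  # entries should lie near mean(b) = 3.0
```

With the constant stepsize, the estimates settle in a neighborhood of the minimizer of $\sum_i f_i$ whose radius shrinks with `alpha`, matching the $O(\alpha)$ behavior the abstract describes.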
Author:
Choi, Woocheol, Kim, Jimyeong
Distributed optimization has received a lot of interest in recent years due to its wide applications in various fields. In this work, we revisit the convergence property of the decentralized gradient descent [A. Nedi{\'c} - A. Ozdaglar (2009)] on the wh…
External link:
http://arxiv.org/abs/2203.09079
In this paper, we analyze an operator splitting scheme for the nonlinear heat equation in $\Omega\subset\mathbb{R}^d$ ($d\geq 1$): $\partial_t u = \Delta u + \lambda |u|^{p-1} u$ in $\Omega\times(0,\infty)$, $u=0$ on $\partial\Omega\times(0,\infty)$, …
External link:
http://arxiv.org/abs/2202.01430