Showing 1 - 10 of 8,247 results for search: '"Altschuler"'
Author:
Altschuler, Jason M., Chewi, Sinho
Coupling arguments are a central tool for bounding the deviation between two stochastic processes, but traditionally have been limited to Wasserstein metrics. In this paper, we apply the shifted composition rule -- an information-theoretic principle in…
External link:
http://arxiv.org/abs/2412.17997
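The classical Wasserstein-style coupling argument that this abstract contrasts with can be illustrated with a toy synchronous coupling (this is an illustrative sketch, not the paper's shifted composition rule):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synchronous coupling: run two copies of a contracting stochastic process
# from different starting points but with a SHARED noise sequence. The
# distance between the coupled trajectories upper-bounds the Wasserstein
# distance between the two laws.
def step(x, xi, rate=0.5):
    # One step of the contracting linear recursion x <- (1 - rate) * x + xi.
    return (1.0 - rate) * x + xi

x, y = 5.0, -3.0
for _ in range(50):
    xi = rng.standard_normal()  # the same noise drives both chains
    x, y = step(x, xi), step(y, xi)

# Under shared noise the gap contracts deterministically:
# |x - y| = (1 - rate)^50 * |x0 - y0|, which is essentially zero here.
print(abs(x - y))
```

Because the noise cancels in the difference, the coupled gap shrinks geometrically regardless of the noise realization; this is exactly why synchronous coupling yields Wasserstein (but not information-theoretic) bounds.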
We show that for separable convex optimization, random stepsizes fully accelerate Gradient Descent. Specifically, using inverse stepsizes i.i.d. from the Arcsine distribution improves the iteration complexity from $O(k)$ to $O(k^{1/2})$, where $k$ is…
External link:
http://arxiv.org/abs/2412.05790
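A minimal sketch of the idea on a separable convex quadratic, assuming (this scaling is our assumption, not stated in the snippet) that the Arcsine law is rescaled to the curvature range [mu, L]:

```python
import numpy as np

rng = np.random.default_rng(0)

# Gradient descent on a separable convex quadratic
#   f(x) = 0.5 * sum_i a_i * x_i^2,   with curvatures a_i in [mu, L],
# using INVERSE stepsizes sampled i.i.d. from an Arcsine law on [mu, L].
mu, L = 0.5, 1.0
a = rng.uniform(mu, L, size=50)   # per-coordinate curvatures
x = rng.standard_normal(50)

def inverse_stepsize(rng):
    # If U ~ Uniform(0, 1), then mu + (L - mu) * sin(pi*U/2)**2 follows
    # the Arcsine distribution supported on [mu, L].
    u = rng.uniform()
    return mu + (L - mu) * np.sin(np.pi * u / 2.0) ** 2

for _ in range(200):
    eta = 1.0 / inverse_stepsize(rng)
    x = x - eta * (a * x)         # coordinatewise gradient step

print(np.abs(x).max())            # iterate is driven toward the minimizer 0
```

The Arcsine law is the equilibrium measure of an interval, the same object behind Chebyshev stepsize schedules; sampling inverse stepsizes from it randomizes that schedule.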
Autor:
Bok, Jinho, Altschuler, Jason M.
Surprisingly, recent work has shown that gradient descent can be accelerated without using momentum -- just by judiciously choosing stepsizes. An open question raised by several papers is whether this phenomenon of stepsize-based acceleration holds m…
External link:
http://arxiv.org/abs/2412.05497
Published in:
The Musical Times, 2024 Dec 01. 165(1969), 57-62.
External link:
https://www.jstor.org/stable/27344481
A seminal result of Lee asserts that the Ramsey number of any bipartite $d$-degenerate graph $H$ satisfies $\log r(H) = \log n + O(d)$. In particular, this bound applies to every bipartite graph of maximal degree $\Delta$. It remains a compelling cha…
External link:
http://arxiv.org/abs/2410.18223
A seminal open question of Pisier and Mendel--Naor asks whether every degree-regular graph which satisfies the classical discrete Poincar\'e inequality for scalar functions, also satisfies an analogous inequality for functions taking values in…
External link:
http://arxiv.org/abs/2410.04394
Noisy gradient descent and its variants are the predominant algorithms for differentially private machine learning. It is a fundamental question to quantify their privacy leakage, yet tight characterizations remain open even in the foundational setti…
External link:
http://arxiv.org/abs/2403.00278
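The mechanism under study, noisy gradient descent, can be sketched as follows (illustrative objective and parameters; gradient clipping and the privacy accounting itself, which is the paper's subject, are omitted):

```python
import numpy as np

rng = np.random.default_rng(0)

# Noisy gradient descent: each gradient is perturbed by Gaussian noise of
# scale sigma before the step -- the basic primitive of differentially
# private training. Objective here is f(x) = 0.5 * ||x||^2.
eta, sigma, steps = 0.1, 0.5, 200
x = rng.standard_normal(5)
for _ in range(steps):
    grad = x                                       # gradient of 0.5 * ||x||^2
    x = x - eta * (grad + sigma * rng.standard_normal(5))

print(np.linalg.norm(x))  # iterate hovers in a small noise ball around 0
```

The privacy question is how much the distribution of such noisy iterates reveals about any single training example, as a function of eta, sigma, and the number of steps.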