Showing 1 - 10 of 23,950 for search: '"Gaussian smoothing"'
This article introduces a novel family of optimization algorithms - Anisotropic Gaussian Smoothing Gradient Descent (AGS-GD), AGS-Stochastic Gradient Descent (AGS-SGD), and AGS-Adam - that employ anisotropic Gaussian smoothing to enhance traditional …
External link:
http://arxiv.org/abs/2411.11747
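A minimal sketch of the mechanism named in the entry above, assuming a diagonal (per-coordinate) smoothing covariance and a plain Monte Carlo score-function estimator; the function names, the fixed smoothing radii, and the toy problem are illustrative choices here, not details taken from the paper (in particular, the paper's rule for adapting the anisotropic covariance is not reproduced):

    import numpy as np

    def anisotropic_smoothed_grad(f, x, sigma, n_samples=64, rng=None):
        # Monte Carlo estimate of grad f_Sigma(x), where
        #   f_Sigma(x) = E_{u ~ N(0, I)}[ f(x + sigma * u) ]
        # and sigma is a per-coordinate vector (diagonal smoothing covariance).
        # Uses the score-function identity grad f_Sigma(x) = E[ f(x + sigma*u) * u / sigma ].
        rng = np.random.default_rng() if rng is None else rng
        x, sigma = np.asarray(x, float), np.asarray(sigma, float)
        u = rng.standard_normal((n_samples, x.size))
        fvals = np.array([f(x + sigma * ui) for ui in u])
        fvals -= fvals.mean()                      # baseline subtraction to reduce variance
        return (fvals[:, None] * u / sigma).mean(axis=0)

    # toy usage: ill-conditioned quadratic, smoothed more along the flat axis than the steep one
    f = lambda z: z[0] ** 2 + 100.0 * z[1] ** 2
    x = np.array([3.0, 2.0])
    sigma = np.array([1.0, 0.1])                   # anisotropic smoothing radii
    for _ in range(500):
        x -= 1e-3 * anisotropic_smoothed_grad(f, x, sigma)
    print(x)                                       # drifts toward the minimiser at the origin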
Author:
Xu, Chen
We propose a novel method that solves global optimization problems in two steps: (1) apply an (exponential) power-$N$ transformation to the not-necessarily differentiable objective function $f$ to get $f_N$, and (2) optimize the Gaussian-smoothed $f_N$ …
External link:
http://arxiv.org/abs/2412.05204
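One plausible reading of the two-step recipe above, under the assumption that the power-$N$ transform takes the form $f_N(x) = \exp(N f(x))$ for a maximization problem and that the smoothed surrogate is optimized with a generic zeroth-order score-function estimator; the exact transform, estimator, and step rule in the paper may differ:

    import numpy as np

    def smoothed_grad(g, x, sigma, n_samples=128, rng=None):
        # zeroth-order estimate of the gradient of the Gaussian-smoothed g (isotropic radius sigma)
        rng = np.random.default_rng() if rng is None else rng
        u = rng.standard_normal((n_samples, x.size))
        vals = np.array([g(x + sigma * ui) for ui in u])
        return ((vals - vals.mean())[:, None] * u).mean(axis=0) / sigma

    def power_n_then_smooth(f, x0, N=4.0, sigma=1.0, lr=0.05, steps=300, rng=None):
        # Step 1 (assumed form): f_N(x) = exp(N * f(x)), which sharpens the global maximum.
        # Step 2: normalised gradient ascent on the Gaussian-smoothed f_N.
        rng = np.random.default_rng() if rng is None else rng
        f_N = lambda z: np.exp(N * f(z))
        x = np.asarray(x0, float).copy()
        for _ in range(steps):
            g = smoothed_grad(f_N, x, sigma, rng=rng)
            x += lr * g / (np.linalg.norm(g) + 1e-12)   # normalised step keeps the sketch stable
        return x

    # toy multimodal objective to maximise; the global maximum sits at the origin
    f = lambda z: -np.sum(z ** 2) + 0.5 * np.sum(np.cos(5.0 * z))
    print(power_n_then_smooth(f, x0=np.array([1.0, -0.8])))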
We study the class of subdifferentially polynomially bounded (SPB) functions, which is a rich class of locally Lipschitz functions that encompasses all Lipschitz functions, all gradient- or Hessian-Lipschitz functions, and even some non-smooth locally …
External link:
http://arxiv.org/abs/2405.04150
Academic article
This result cannot be displayed for users who are not logged in; log in to view it.
Author:
Aghasi, Alireza, Ghadimi, Saeed
In this paper, we study and analyze zeroth-order stochastic approximation algorithms for solving bilevel problems, when neither the upper/lower objective values nor their unbiased gradient estimates are available. In particular, exploiting Stein's identity …
External link:
http://arxiv.org/abs/2404.00158
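The single-level building block this abstract alludes to, namely Stein's identity turning function evaluations into an estimate of the gradient of a Gaussian-smoothed objective, can be sketched as follows; the bilevel coupling of upper- and lower-level iterates is not reproduced, and the symmetric two-point form is a common variance-reduced variant rather than necessarily the estimator used in the paper:

    import numpy as np

    def stein_zeroth_order_grad(f, x, mu=1e-2, n_samples=64, rng=None):
        # Stein's identity for u ~ N(0, I):  E[ f(x + mu*u) * u ] = mu * grad f_mu(x),
        # where f_mu is the Gaussian-smoothed version of f.  The symmetric (two-point)
        # form used below has the same expectation but lower variance.
        rng = np.random.default_rng() if rng is None else rng
        x = np.asarray(x, float)
        u = rng.standard_normal((n_samples, x.size))
        diffs = np.array([(f(x + mu * ui) - f(x - mu * ui)) / (2.0 * mu) for ui in u])
        return (diffs[:, None] * u).mean(axis=0)

    # sanity check on a smooth test function: result should be close to the true gradient
    f = lambda z: np.sin(z[0]) + z[1] ** 2
    print(stein_zeroth_order_grad(f, np.array([0.3, -1.0]), n_samples=4000))  # ≈ [cos(0.3), -2.0]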
Author:
Starnes, Andrew, Webster, Clayton
This paper formalizes and analyzes Gaussian smoothing applied to two prominent optimization methods in deep learning: Stochastic Gradient Descent (GSmoothSGD) and Adam (GSmoothAdam). By attenuating small fluctuations, Gaussian smoothing lowers the risk of …
External link:
http://arxiv.org/abs/2311.00531
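A hedged sketch of what a Gaussian-smoothed SGD step can look like when gradients of the loss are available: the minibatch gradient is replaced by an average of gradients evaluated at Gaussian-perturbed weights, which is an unbiased estimate of the gradient of the smoothed loss. The use of plain SGD rather than Adam, the perturbation count, and all names below are choices of this sketch, not the paper's:

    import numpy as np

    def gsmooth_sgd(grad_loss, w0, sigma=0.05, lr=0.1, n_perturb=8, steps=200, rng=None):
        # SGD on the Gaussian-smoothed loss L_sigma(w) = E_{u ~ N(0, I)}[ L(w + sigma*u) ].
        # Because grad L_sigma(w) = E[ grad L(w + sigma*u) ], averaging gradients taken at
        # a few Gaussian-perturbed copies of the weights gives an unbiased estimate of the
        # smoothed gradient.
        rng = np.random.default_rng() if rng is None else rng
        w = np.asarray(w0, float).copy()
        for _ in range(steps):
            u = rng.standard_normal((n_perturb, w.size))
            g = np.mean([grad_loss(w + sigma * ui) for ui in u], axis=0)
            w -= lr * g
        return w

    # toy problem: least squares with grad L(w) = A^T (A w - b) / n
    rng = np.random.default_rng(0)
    A, b = rng.standard_normal((20, 3)), rng.standard_normal(20)
    grad_loss = lambda w: A.T @ (A @ w - b) / len(b)
    print(gsmooth_sgd(grad_loss, w0=np.zeros(3)))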
This work analyzes the convergence of a class of smoothing-based gradient descent methods when applied to optimization problems. In particular, Gaussian smoothing is employed to define a nonlocal gradient that reduces high-frequency noise, small variations, …
External link:
http://arxiv.org/abs/2311.00521
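For reference, the two standard identities behind this family of methods, written for the isotropic case with smoothing radius $\sigma$ (normalization conventions vary across these papers):

$$ f_\sigma(x) = \mathbb{E}_{u\sim\mathcal{N}(0,I)}\big[f(x+\sigma u)\big], \qquad \nabla f_\sigma(x) = \mathbb{E}\big[\nabla f(x+\sigma u)\big] = \frac{1}{\sigma}\,\mathbb{E}\big[f(x+\sigma u)\,u\big]. $$

The middle expression requires $f$ to be differentiable, while the right-hand score-function form needs only function values, which is what makes the nonlocal gradient usable on noisy or non-smooth objectives.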
Author:
Lindeberg, Tony
Published in:
Journal of Mathematical Imaging and Vision, 2024
This paper develops an in-depth treatment concerning the problem of approximating the Gaussian smoothing and Gaussian derivative computations in scale-space theory for application on discrete data. With close connections to previous axiomatic treatments …
External link:
http://arxiv.org/abs/2311.11317
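As a point of reference for the discrete approximation problem treated above, the simplest baseline is the sampled (truncated and renormalized) Gaussian kernel, with Gaussian derivatives obtained by sampling the derivative kernel; the discrete analogue of the Gaussian kernel that Lindeberg's axiomatic treatment favours (built from modified Bessel functions) is not shown in this sketch:

    import numpy as np

    def sampled_gaussian_kernel(sigma, truncate=4.0):
        # truncated, renormalised sampling of the continuous Gaussian
        r = int(np.ceil(truncate * sigma))
        t = np.arange(-r, r + 1)
        k = np.exp(-t ** 2 / (2.0 * sigma ** 2))
        return k / k.sum()

    def sampled_gaussian_derivative_kernel(sigma, truncate=4.0):
        # sampling of the first-order Gaussian derivative kernel g'(t) = -t/sigma^2 * g(t)
        r = int(np.ceil(truncate * sigma))
        t = np.arange(-r, r + 1)
        g = np.exp(-t ** 2 / (2.0 * sigma ** 2)) / (sigma * np.sqrt(2.0 * np.pi))
        return -t / sigma ** 2 * g

    rng = np.random.default_rng(1)
    signal = np.sin(np.linspace(0.0, 4.0 * np.pi, 200)) + 0.2 * rng.standard_normal(200)
    smoothed   = np.convolve(signal, sampled_gaussian_kernel(3.0), mode="same")
    derivative = np.convolve(signal, sampled_gaussian_derivative_kernel(3.0), mode="same")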
We analyze the convergence of a nonlocal gradient descent method for minimizing a class of high-dimensional non-convex functions, where a directional Gaussian smoothing (DGS) is proposed to define the nonlocal gradient (also referred to as the DGS gradient) …
External link:
http://arxiv.org/abs/2302.06404
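The DGS gradient referred to above is typically assembled from 1-D Gaussian-smoothed directional derivatives, each estimated by Gauss-Hermite quadrature; the sketch below uses the standard coordinate basis, a fixed smoothing radius, and a fixed quadrature order, all of which are simplifying choices rather than the paper's settings:

    import numpy as np
    from numpy.polynomial.hermite import hermgauss

    def dgs_gradient(f, x, sigma=0.5, order=7):
        # Directional Gaussian smoothing (DGS) gradient: for each direction xi_i, the
        # 1-D Gaussian-smoothed directional derivative
        #     d_i = (1/sigma^2) * E_{s ~ N(0, sigma^2)}[ s * f(x + s*xi_i) ]
        # is evaluated with Gauss-Hermite quadrature (substituting s = sqrt(2)*sigma*t),
        # and the d_i are assembled into a nonlocal gradient vector.
        x = np.asarray(x, float)
        t, w = hermgauss(order)                    # nodes and weights for the weight exp(-t^2)
        grad = np.zeros_like(x)
        for i in range(x.size):
            xi = np.zeros_like(x)
            xi[i] = 1.0
            s = np.sqrt(2.0) * sigma * t           # quadrature nodes along the line x + s*xi_i
            fvals = np.array([f(x + si * xi) for si in s])
            grad[i] = (w * s * fvals).sum() / (np.sqrt(np.pi) * sigma ** 2)
        return grad

    # sanity check: on a smooth function the DGS gradient is close to the true gradient
    f = lambda z: np.sin(z[0]) + 0.5 * z[1] ** 2
    print(dgs_gradient(f, np.array([0.3, -1.0])))  # ≈ [cos(0.3)*exp(-sigma^2/2), -1.0]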
Academic article
This result cannot be displayed for users who are not logged in; log in to view it.