Showing 1 - 5 of 5 results for search: '"Sun, Shigeng"'
Author:
Sun, Shigeng, Nocedal, Jorge
This paper introduces a modified Byrd-Omojokun (BO) trust region algorithm to address the challenges posed by noisy function and gradient evaluations. The original BO method was designed to solve equality constrained problems, and it forms the backbone …
External link:
http://arxiv.org/abs/2411.02665
The development of nonlinear optimization algorithms capable of performing reliably in the presence of noise has garnered considerable attention lately. This paper advocates for strategies to create noise-tolerant nonlinear optimization algorithms by …
External link:
http://arxiv.org/abs/2401.15007
Author:
Sun, Shigeng, Xie, Yuchen
Many machine learning applications and tasks rely on the stochastic gradient descent (SGD) algorithm and its variants. Effective step length selection is crucial for the success of these algorithms, which has motivated the development of algorithms …
External link:
http://arxiv.org/abs/2305.09978
Author:
Sun, Shigeng, Nocedal, Jorge
Classical trust region methods were designed to solve problems in which function and gradient information are exact. This paper considers the case when there are bounded errors (or noise) in the above computations and proposes a simple modification of …
External link:
http://arxiv.org/abs/2201.00973
Author:
Sun, Shigeng, Nocedal, Jorge (j-nocedal@northwestern.edu)
Published in:
Mathematical Programming, Nov 2023, Vol. 202, Issue 1/2, pp. 445-472. 28 pp.