Showing 1 - 10 of 271
for query: '"NOCEDAL, JORGE"'
Author:
Sun, Shigeng, Nocedal, Jorge
This paper introduces a modified Byrd-Omojokun (BO) trust region algorithm to address the challenges posed by noisy function and gradient evaluations. The original BO method was designed to solve equality-constrained problems and forms the backbone …
External link:
http://arxiv.org/abs/2411.02665
Author:
Xuan, Melody Qiming, Nocedal, Jorge
This paper explores a method for solving constrained optimization problems when the derivatives of the objective function are unavailable, while the derivatives of the constraints are known. We allow the objective and constraint functions to be nonconvex …
External link:
http://arxiv.org/abs/2402.11920
The development of nonlinear optimization algorithms capable of performing reliably in the presence of noise has garnered considerable attention lately. This paper advocates for strategies to create noise-tolerant nonlinear optimization algorithms by …
External link:
http://arxiv.org/abs/2401.15007
Author:
Sun, Shigeng, Nocedal, Jorge
Classical trust region methods were designed to solve problems in which function and gradient information are exact. This paper considers the case when there are bounded errors (or noise) in the above computations and proposes a simple modification of …
External link:
http://arxiv.org/abs/2201.00973
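The entry above concerns relaxing the classical trust-region acceptance test when function values carry bounded noise. One natural way to do this is to grant the ratio test an allowance of the noise level `eps_f`, so a step whose true reduction is positive is not rejected just because noise made the measured reduction slightly negative. The sketch below illustrates that general idea only; the constants, the `eta` threshold, and the exact form of the allowance are assumptions for illustration, not the paper's precise modification.

```python
def relaxed_ratio_test(ared, pred, eps_f, eta=0.1):
    """Trust-region acceptance test relaxed for noisy function values.

    With noise of size eps_f in each evaluation, the actual reduction
    `ared` is only known to within about 2*eps_f, so a small or
    slightly negative measured `ared` should not automatically reject
    a step.  Adding the noise allowance to both the actual and
    predicted reductions is one simple safeguard (illustrative only).
    """
    rho = (ared + 2.0 * eps_f) / (pred + 2.0 * eps_f)
    return rho >= eta

# A step whose true reduction is ~0 but whose noise-corrupted measured
# reduction is -1e-6 is rejected by the classical test (eps_f = 0)
# yet accepted once the noise level eps_f = 1e-6 is accounted for:
accept_noisy = relaxed_ratio_test(ared=-1e-6, pred=1e-7, eps_f=1e-6)
accept_clean = relaxed_ratio_test(ared=-1e-6, pred=1e-7, eps_f=0.0)
```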
A common approach for minimizing a smooth nonlinear function is to employ finite-difference approximations to the gradient. While this is straightforward when the function evaluations are exact, when the function is noisy, …
External link:
http://arxiv.org/abs/2110.06380
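For finite differences under noise, the central difficulty is choosing the differencing interval: balancing the truncation error, O(h), against the noise error, O(eps_f / h), suggests an interval on the order of sqrt(eps_f), far larger than the machine-epsilon-based choice used for exact functions. The sketch below illustrates this classical rule of thumb; the constant 2, the second-derivative bound, and the forward-difference form are illustrative assumptions, not the paper's interval-estimation procedure.

```python
import numpy as np

def fd_gradient(f, x, eps_f, second_deriv_bound=1.0):
    """Forward-difference gradient with a noise-aware interval.

    Balancing truncation error O(h) against noise error O(eps_f / h)
    gives h on the order of sqrt(eps_f).  (Illustrative sketch only.)
    """
    h = 2.0 * np.sqrt(eps_f / max(second_deriv_bound, 1e-16))
    fx = f(x)
    g = np.empty_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - fx) / h
    return g

# Noisy quadratic test function: the true gradient at x is x itself.
rng = np.random.default_rng(0)
eps_f = 1e-6
f = lambda z: 0.5 * z @ z + eps_f * rng.uniform(-1.0, 1.0)
x = np.array([1.0, -2.0])
g = fd_gradient(f, x, eps_f)
```

With eps_f = 1e-6 this picks h = 2e-3, and both error sources contribute at most about 1e-3 per component.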
The problem of interest is the minimization of a nonlinear function subject to nonlinear equality constraints using a sequential quadratic programming (SQP) method. The minimization must be performed while observing only noisy evaluations of the objective …
External link:
http://arxiv.org/abs/2110.04355
The goal of this paper is to investigate an approach for derivative-free optimization that has not received sufficient attention in the literature, yet is one of the simplest to implement and parallelize. It consists of computing gradients of a smoothed …
External link:
http://arxiv.org/abs/2102.09762
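A common concrete instance of "computing gradients of a smoothed objective" is Gaussian smoothing with a Monte Carlo estimator, whose function evaluations are independent and therefore trivially parallel. The sketch below illustrates that general idea under this assumption; the antithetic (central) form, the smoothing parameter `sigma`, and the sample count are illustrative choices, not necessarily the paper's exact estimator.

```python
import numpy as np

def smoothed_gradient(f, x, sigma, n_samples, rng):
    """Monte Carlo gradient of the Gaussian-smoothed objective
    f_sigma(x) = E_u[f(x + sigma*u)], u ~ N(0, I), using the
    antithetic (central) estimator, which has lower variance than
    the one-sided form.  Each sample pair is independent, so the
    evaluations parallelize trivially.  (Illustrative sketch only.)
    """
    g = np.zeros_like(x)
    for _ in range(n_samples):
        u = rng.standard_normal(x.size)
        g += (f(x + sigma * u) - f(x - sigma * u)) / (2.0 * sigma) * u
    return g / n_samples

rng = np.random.default_rng(1)
f = lambda z: 0.5 * z @ z          # smooth test function; gradient is z
x = np.array([1.0, -2.0])
g = smoothed_gradient(f, x, sigma=0.1, n_samples=4000, rng=rng)
```

For a quadratic the antithetic estimator is unbiased, so with 4000 samples the estimate lands close to the true gradient x.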
The motivation for this paper stems from the desire to develop an adaptive sampling method for solving constrained optimization problems in which the objective function is stochastic and the constraints are deterministic. The method proposed in this paper …
External link:
http://arxiv.org/abs/2012.15411
This paper describes an extension of the BFGS and L-BFGS methods for the minimization of a nonlinear function subject to errors. This work is motivated by applications that contain computational noise, employ low-precision arithmetic, or are subject to …
External link:
http://arxiv.org/abs/2010.04352
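One simple safeguard for quasi-Newton updating under errors is to skip a curvature pair (s, y) whenever the measured curvature s^T y could be explained by gradient noise alone, keeping the inverse-Hessian approximation well defined. The sketch below shows a standard BFGS inverse-Hessian update with such a skip test; the threshold and the skip rule are illustrative devices, not necessarily the mechanism of the paper above.

```python
import numpy as np

def bfgs_update_noise_aware(H, s, y, eps_g):
    """Inverse-Hessian BFGS update that skips the pair (s, y) when
    the measured curvature s^T y is not reliably positive relative
    to gradient noise of size eps_g.  (Illustrative safeguard only.)
    """
    sy = s @ y
    if sy <= 2.0 * eps_g * np.linalg.norm(s):
        return H  # curvature not trustworthy: keep H unchanged
    rho = 1.0 / sy
    I = np.eye(s.size)
    V = I - rho * np.outer(s, y)
    return V @ H @ V.T + rho * np.outer(s, s)

H = np.eye(2)
s = np.array([1.0, 0.5])
y = np.array([2.0, 1.0])
H_new = bfgs_update_noise_aware(H, s, y, eps_g=1e-8)    # pair accepted
H_skip = bfgs_update_noise_aware(H, s, -y, eps_g=1e-8)  # pair skipped
```

An accepted update satisfies the secant condition H_new @ y == s; a skipped one returns H unchanged.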
The classical convergence analysis of quasi-Newton methods assumes that the function and gradients employed at each iteration are exact. In this paper, we consider the case when there are (bounded) errors in both computations and establish conditions …
External link:
http://arxiv.org/abs/1901.09063