Showing 1 - 10 of 42 for search: '"Brust, Johannes J"'
Author:
Brust, Johannes J., Saunders, Michael A.
For linear systems $Ax=b$ we develop iterative algorithms based on a sketch-and-project approach. By using judicious choices for the sketch, such as the history of residuals, we develop weighting strategies that enable short recursive formulas. … [a short illustrative sketch follows this entry]
External link:
http://arxiv.org/abs/2407.00746
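A minimal sketch of the general sketch-and-project idea this entry builds on, using a random row-sampling sketch; the paper's residual-history sketches and weighting strategies are not reproduced, and the function name and parameters below are illustrative only.

import numpy as np

def sketch_and_project(A, b, num_iters=500, sketch_size=5, seed=0):
    # Generic sketch-and-project for a consistent system Ax = b:
    # each step samples a few rows (the sketch S) and projects the iterate
    # onto the solution set of the sketched equations S^T A x = S^T b.
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = np.zeros(n)
    for _ in range(num_iters):
        rows = rng.choice(m, size=min(sketch_size, m), replace=False)
        As, bs = A[rows], b[rows]              # S^T A and S^T b for a sampling sketch
        r = bs - As @ x                        # sketched residual
        # x <- x + A^T S (S^T A A^T S)^+ S^T (b - A x)
        x += As.T @ np.linalg.lstsq(As @ As.T, r, rcond=None)[0]
    return x

# Toy usage on a consistent system
A = np.random.default_rng(1).standard_normal((50, 10))
x_true = np.arange(10, dtype=float)
print(np.linalg.norm(sketch_and_project(A, A @ x_true) - x_true))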
Author:
Brust, Johannes J.
For minimization problems without 2nd derivative information, methods that estimate Hessian matrices can be very effective. However, conventional techniques generate dense matrices that are prohibitive for large problems. Limited-memory compact representations… [a short illustrative sketch follows this entry]
External link:
http://arxiv.org/abs/2403.12206
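A minimal sketch of what a limited-memory compact representation buys: the Hessian estimate is kept in the factored form B = gamma*I + U M U^T with only a few columns in U, so products with B cost O(nk) work and the dense n-by-n matrix is never formed. The factors and names below are illustrative assumptions, not the paper's specific update.

import numpy as np

n, k = 10_000, 5                          # many variables, few stored columns
rng = np.random.default_rng(0)
gamma = 1.0
U = rng.standard_normal((n, k))           # tall-skinny factor
M = rng.standard_normal((k, k))
M = 0.5 * (M + M.T)                       # small symmetric middle matrix

def hess_vec(v):
    # Product with B = gamma*I + U M U^T without forming B (O(n*k) work)
    return gamma * v + U @ (M @ (U.T @ v))

print(hess_vec(rng.standard_normal(n))[:3])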
Author:
Brust, Johannes J., Gill, Philip E.
For quasi-Newton methods in unconstrained minimization, it is valuable to develop methods that are robust, i.e., methods that converge on a large number of problems. Trust-region algorithms are often regarded to be more robust than line-search methods… [a short illustrative sketch follows this entry]
External link:
http://arxiv.org/abs/2312.06884
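For context, a bare-bones trust-region loop is sketched below: a Cauchy-point step inside the region, an accept/reject test, and a radius update. It only illustrates the generic mechanism the entry refers to and is not the quasi-Newton method of the paper; all names and constants are illustrative.

import numpy as np

def trust_region_minimize(f, grad, hess, x0, delta=1.0, max_iter=100, tol=1e-8):
    # Bare-bones trust-region iteration: the step comes from a quadratic model
    # restricted to ||s|| <= delta; the ratio of actual to predicted reduction
    # decides whether to accept the step and how to resize the region.
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g, B = grad(x), hess(x)
        if np.linalg.norm(g) < tol:
            break
        gBg = g @ B @ g
        tau = 1.0 if gBg <= 0 else min(1.0, np.linalg.norm(g)**3 / (delta * gBg))
        s = -tau * delta / np.linalg.norm(g) * g      # Cauchy-point step
        pred = -(g @ s + 0.5 * s @ B @ s)             # predicted decrease
        rho = (f(x) - f(x + s)) / pred if pred > 0 else -1.0
        if rho > 0.1:                                 # accept the step
            x = x + s
        delta = 0.5 * delta if rho < 0.25 else (2.0 * delta if rho > 0.75 else delta)
    return x

# Usage on a simple strictly convex quadratic
f = lambda x: (x[0] - 1)**2 + 5 * (x[1] + 2)**2
grad = lambda x: np.array([2 * (x[0] - 1), 10 * (x[1] + 2)])
hess = lambda x: np.diag([2.0, 10.0])
print(trust_region_minimize(f, grad, hess, np.zeros(2)))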
In this work, we consider methods for large-scale and nonconvex unconstrained optimization. We propose a new trust-region method whose subproblem is defined using a so-called "shape-changing" norm together with densely-initialized multipoint symmetric… [a short illustrative sketch follows this entry]
External link:
http://arxiv.org/abs/2209.12057
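A hedged sketch of the kind of subproblem such a method solves: with a low-rank model Hessian B = gamma*I + Psi M Psi^T, an eigendecomposition of the small middle matrix splits R^n into range(Psi) and its orthogonal complement, and a shape-changing (infinity-type) norm lets the subproblem decouple over those pieces. The code assumes gamma > 0 and a convex model; the paper's dense initializations and full solver are not reproduced.

import numpy as np

def shape_changing_tr_step(g, gamma, Psi, M, delta):
    # Trust-region step for B = gamma*I + Psi M Psi^T under a shape-changing norm:
    # solve k decoupled 1-D problems in range(Psi) and a scaled-gradient step
    # in the orthogonal complement, each bounded by the radius delta.
    Q, R = np.linalg.qr(Psi)                      # thin QR of the low-rank factor
    lam, V = np.linalg.eigh(R @ M @ R.T)          # eigenvalues of the small matrix
    P_par = Q @ V                                 # orthonormal basis of range(Psi)
    g_par = P_par.T @ g                           # gradient, parallel part
    g_perp = g - P_par @ g_par                    # gradient, orthogonal part
    v_par = np.clip(-g_par / (gamma + lam), -delta, delta)
    v_perp = -g_perp / gamma                      # B acts as gamma*I here
    nrm = np.linalg.norm(v_perp)
    if nrm > delta:
        v_perp *= delta / nrm
    return P_par @ v_par + v_perp

# Toy usage with a random low-rank model
rng = np.random.default_rng(0)
n, k = 200, 4
Psi = rng.standard_normal((n, k))
M = np.eye(k)                                     # keeps the model positive definite
g = rng.standard_normal(n)
print(np.linalg.norm(shape_changing_tr_step(g, 1.0, Psi, M, 0.5)))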
Published in:
Computational Optimization and Applications 80:55-88 (2021)
For general large-scale optimization problems compact representations exist in which recursive quasi-Newton update formulas are represented as compact matrix factorizations. For problems in which the objective function contains additional structure, … [a short illustrative sketch follows this entry]
External link:
http://arxiv.org/abs/2208.00057
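The compact representations mentioned here generalize the classical one for BFGS (Byrd, Nocedal, and Schnabel), in which k recursive updates collapse into small matrices built from the stored pairs S = [s_0, ..., s_{k-1}] and Y = [y_0, ..., y_{k-1}]. The short check below builds both forms for plain BFGS and confirms they agree; the structured variants of the paper are not reproduced.

import numpy as np

rng = np.random.default_rng(0)
n, k = 8, 4
B = np.eye(n)                                   # B_0 = I
S = rng.standard_normal((n, k))
Y = S + 0.1 * rng.standard_normal((n, k))       # keeps s_i^T y_i > 0 (curvature condition)

# Recursive BFGS updates
Bk = B.copy()
for i in range(k):
    s, y = S[:, i], Y[:, i]
    Bs = Bk @ s
    Bk = Bk - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)

# Compact representation: B_k = B_0 - [B_0 S, Y] W^{-1} [B_0 S, Y]^T,
# W = [[S^T B_0 S, L], [L^T, -D]], L = strictly lower triangle of S^T Y, D = diag(S^T Y)
StY = S.T @ Y
L = np.tril(StY, -1)
D = np.diag(np.diag(StY))
Psi = np.hstack([B @ S, Y])
W = np.block([[S.T @ B @ S, L], [L.T, -D]])
Bk_compact = B - Psi @ np.linalg.solve(W, Psi.T)

print(np.max(np.abs(Bk - Bk_compact)))          # ~1e-14: both forms agree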
Published in:
SIAM J. Sci. Comput. 45(2), 2023
We propose iterative projection methods for solving square or rectangular consistent linear systems Ax = b. Existing projection methods use sketching matrices (possibly randomized) to generate a sequence of small projected subproblems, but even the… [a short illustrative sketch follows this entry]
External link:
http://arxiv.org/abs/2207.07615
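A generic sketched-projection iteration of the kind this entry describes, forming and solving an explicit small projected subproblem with a dense Gaussian sketch at each step. The function name and parameters are placeholders; the paper's method and its memory-efficient recursions are not reproduced.

import numpy as np

def gaussian_sketch_projection(A, b, k=10, iters=300, seed=0):
    # Each step draws a Gaussian sketch S (m x k), solves the small system
    # (S^T A A^T S) z = S^T (b - A x), and applies the minimum-norm correction
    # x <- x + A^T S z so the sketched equations S^T A x = S^T b hold.
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = np.zeros(n)
    for _ in range(iters):
        S = rng.standard_normal((m, k))
        As = S.T @ A                              # k x n sketched matrix
        rhs = S.T @ b - As @ x                    # sketched residual
        z = np.linalg.lstsq(As @ As.T, rhs, rcond=None)[0]
        x += As.T @ z
    return x

# Toy usage: rectangular consistent system
rng = np.random.default_rng(1)
A = rng.standard_normal((80, 120))
b = A @ rng.standard_normal(120)                  # consistent by construction
print(np.linalg.norm(A @ gaussian_sketch_projection(A, b) - b))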
Author:
Brust, Johannes J.
Published in:
Proceedings of the 38th International Conference on Machine Learning, PMLR 139, 2021
For large nonlinear least squares loss functions in machine learning we exploit the property that the number of model parameters typically exceeds the data in one batch. This implies a low-rank structure in the Hessian of the loss, which enables… [a short illustrative sketch follows this entry]
External link:
http://arxiv.org/abs/2107.05598
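A small illustration of the low-rank structure referred to here: with a batch of m residuals and n >> m parameters, the Gauss-Newton matrix J^T J has rank at most m, so a damped step can be obtained from an m-by-m system via the push-through/Woodbury identity instead of an n-by-n solve. This is illustrative linear algebra only, not the paper's training algorithm; all names are placeholders.

import numpy as np

# Identity used: (lam*I_n + J^T J)^{-1} J^T = J^T (lam*I_m + J J^T)^{-1}
rng = np.random.default_rng(0)
m, n, lam = 32, 2000, 1e-2                 # batch size << number of parameters
J = rng.standard_normal((m, n))            # Jacobian of the batch residuals
r = rng.standard_normal(m)                 # batch residuals

# Small m x m solve (cheap)
p_small = -J.T @ np.linalg.solve(lam * np.eye(m) + J @ J.T, r)

# Direct n x n solve (expensive), for verification only
p_full = -np.linalg.solve(lam * np.eye(n) + J.T @ J, J.T @ r)

print(np.max(np.abs(p_small - p_full)))    # agreement up to rounding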
Author:
Brust, Johannes J., Anitescu, Mihai
Published in:
IEEE Trans. Power Syst. 37(6), 2022
For optimal power flow problems with chance constraints, a particularly effective method is based on a fixed point iteration applied to a sequence of deterministic power flow problems. However, a priori, the convergence of such an approach is not necessarily… [a short illustrative sketch follows this entry]
External link:
http://arxiv.org/abs/2101.11740
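A generic fixed-point outer loop of the kind referred to here, with a simple relative stopping test. Each evaluation of F would stand for the solution of one deterministic problem (e.g., a power flow solve with updated uncertainty margins), which is not modeled in this sketch; the toy map below is deliberately a contraction so the iteration provably converges.

import numpy as np

def fixed_point_iteration(F, x0, tol=1e-10, max_iter=200):
    # Iterate x_{k+1} = F(x_k) until successive iterates stop changing.
    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):
        x_new = F(x)
        if np.linalg.norm(x_new - x) <= tol * (1 + np.linalg.norm(x)):
            return x_new, k + 1
        x = x_new
    return x, max_iter

# Toy contraction: F(x) = c + G x with spectral norm of G scaled to 0.5
rng = np.random.default_rng(0)
G = rng.standard_normal((5, 5))
G *= 0.5 / np.linalg.norm(G, 2)
c = rng.standard_normal(5)
x_star, iters = fixed_point_iteration(lambda x: c + G @ x, np.zeros(5))
print(iters, np.linalg.norm(x_star - (c + G @ x_star)))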
Published in:
SIAM J. Sci. Comput. 44(1), 2022
For optimization problems with linear equality constraints, we prove that the (1,1) block of the inverse KKT matrix remains unchanged when projected onto the nullspace of the constraint matrix. We develop reduced compact representations of the limited-memory… [a numerical check of this property follows this entry]
External link:
http://arxiv.org/abs/2101.11048
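A small numerical check of the stated property: for K = [[H, A^T], [A, 0]] with A of full row rank, the (1,1) block of K^{-1} equals Z (Z^T H Z)^{-1} Z^T for a nullspace basis Z of A, and is therefore unchanged by the orthogonal projection onto null(A). The reduced compact representations built on this fact in the paper are not reproduced; the positive definite H below is an assumption for a well-posed check.

import numpy as np

rng = np.random.default_rng(0)
n, m = 12, 4
A = rng.standard_normal((m, n))                    # full row rank constraints
Hhalf = rng.standard_normal((n, n))
H = Hhalf @ Hhalf.T + n * np.eye(n)                # symmetric positive definite Hessian block

K = np.block([[H, A.T], [A, np.zeros((m, m))]])
W = np.linalg.inv(K)[:n, :n]                       # (1,1) block of the inverse KKT matrix

P = np.eye(n) - A.T @ np.linalg.solve(A @ A.T, A)  # orthogonal projector onto null(A)
print(np.max(np.abs(P @ W @ P - W)))               # ~1e-14: projection leaves W unchanged

Z = np.linalg.svd(A)[2][m:].T                      # orthonormal basis of null(A)
print(np.max(np.abs(Z @ np.linalg.solve(Z.T @ H @ Z, Z.T) - W)))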
Author:
Brust, Johannes J., Gill, Philip E.
Published in:
SIAM Journal on Scientific Computing 46(5), A3330-A3351, 2024