Showing 1 - 10 of 20
for search: '"Latorre, Fabian"'
This paper rethinks Sharpness-Aware Minimization (SAM), which is originally formulated as a zero-sum game where the weights of a network and a bounded perturbation try to minimize/maximize, respectively, the same differentiable loss. To fundamentally…
External link:
http://arxiv.org/abs/2407.12993
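For reference, the zero-sum formulation the abstract says SAM starts from is usually written as the min-max problem below (the loss $L$, weights $w$, and radius $\rho$ are standard SAM notation, assumed here rather than quoted from the paper):

$$\min_{w} \; \max_{\|\epsilon\|_2 \le \rho} \; L(w + \epsilon)$$

Both players act on the same differentiable loss $L$, which is the zero-sum structure the paper revisits.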
One prominent approach toward resolving the adversarial vulnerability of deep neural networks is the two-player zero-sum paradigm of adversarial training, in which predictors are trained against adversarially chosen perturbations of data. Despite the…
External link:
http://arxiv.org/abs/2306.11035
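The two-player zero-sum paradigm mentioned above is commonly written as the following robust objective (standard adversarial-training notation, not taken from this abstract):

$$\min_{\theta} \; \mathbb{E}_{(x,y)\sim\mathcal{D}} \Big[ \max_{\|\delta\| \le \epsilon} \ell\big(f_\theta(x+\delta),\, y\big) \Big]$$

The inner player chooses the data perturbation $\delta$; the outer player trains the predictor $f_\theta$ against it.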
Dynamic Time Warping (DTW) has become the pragmatic choice for measuring distance between time series. However, it suffers from unavoidable quadratic time complexity when the optimal alignment matrix needs to be computed exactly. This hinders its use…
External link:
http://arxiv.org/abs/2306.00620
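To make the quadratic-complexity remark concrete, here is a minimal textbook dynamic-programming DTW in Python; it is a generic sketch of the exact computation, not code from the paper, which targets precisely this bottleneck:

import numpy as np

def dtw_distance(x, y):
    # Exact DTW between 1-D sequences x and y. Filling the full
    # (len(x)+1) x (len(y)+1) cost table is what makes the exact
    # alignment quadratic in time and memory.
    n, m = len(x), len(y)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(x[i - 1] - y[j - 1])
            D[i, j] = cost + min(D[i - 1, j],      # insertion
                                 D[i, j - 1],      # deletion
                                 D[i - 1, j - 1])  # match
    return D[n, m]

print(dtw_distance([0.0, 1.0, 2.0, 3.0], [0.0, 2.0, 2.0, 4.0]))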
While the class of Polynomial Nets demonstrates comparable performance to neural networks (NN), it currently has neither theoretical generalization characterization nor robustness guarantees. To this end, we derive new complexity bounds for the set of…
External link:
http://arxiv.org/abs/2202.05068
We mainly analyze and solve the overfitting problem of Deep Image Prior (DIP). DIP can solve inverse problems such as super-resolution, inpainting, and denoising. The main advantage of DIP over other deep learning approaches is that it does…
External link:
http://arxiv.org/abs/2011.01748
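A minimal sketch of the Deep Image Prior mechanism the abstract refers to, assuming PyTorch, a toy CNN, and placeholder data (neither the architecture nor the stopping rule are the paper's): an untrained network is fitted to a single corrupted image from a fixed random input, with no external training data, and running the fit too long is what produces the overfitting the paper addresses.

import torch
import torch.nn as nn

# Toy stand-in for the DIP generator; published DIP work uses much deeper U-Net-like models.
net = nn.Sequential(
    nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
    nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(),
    nn.Conv2d(64, 3, 3, padding=1),
)
z = torch.randn(1, 32, 128, 128)         # fixed random input, never changed
corrupted = torch.rand(1, 3, 128, 128)   # the single degraded observation (placeholder data)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

for step in range(2000):                 # run too long and the net also fits the corruption
    opt.zero_grad()
    loss = ((net(z) - corrupted) ** 2).mean()
    loss.backward()
    opt.step()

restored = net(z).detach()               # the network output serves as the restored image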
We demonstrate two new important properties of the 1-path-norm of shallow neural networks. First, despite its non-smoothness and non-convexity, it allows a closed-form proximal operator which can be efficiently computed, allowing the use of stochastic…
External link:
http://arxiv.org/abs/2007.01003
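For a one-hidden-layer network $x \mapsto v^\top \sigma(W x)$, the 1-path-norm referred to above is usually defined as the sum of absolute weight products along every input-to-output path (standard definition, assumed here rather than quoted from the paper):

$$\|(W, v)\|_{\text{1-path}} \;=\; \sum_{j,\,i} |v_j|\,|W_{ji}|$$

It is non-smooth and, because of the products, non-convex in $(W, v)$ jointly, yet, as the abstract states, it admits an efficiently computable proximal operator.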
We introduce LiPopt, a polynomial optimization framework for computing increasingly tighter upper bounds on the Lipschitz constant of neural networks. The underlying optimization problems boil down to either linear (LP) or semidefinite (SDP) programming…
External link:
http://arxiv.org/abs/2004.08688
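As background for what is being bounded (a standard fact, not quoted from the abstract): for a differentiable network $f$ and a chosen norm $\|\cdot\|$, the Lipschitz constant equals the supremum of the dual norm of the gradient,

$$L_f \;=\; \sup_{x} \|\nabla f(x)\|_*,$$

and frameworks like LiPopt replace this intractable supremum with a hierarchy of tractable relaxations, here LP or SDP problems.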
Published in:
In Proceedings of NeurIPS 2019, volume 32, pages 13943-13955: http://papers.nips.cc/paper/9545-an-inexact-augmented-lagrangian-framework-for-nonconvex-optimization-with-nonlinear-constraints
We propose a practical inexact augmented Lagrangian method (iALM) for nonconvex problems with nonlinear constraints. We characterize the total computational complexity of our method subject to a verifiable geometric condition, which is closely related…
External link:
http://arxiv.org/abs/1906.11357
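For the nonlinearly constrained problem $\min_x f(x)$ subject to $A(x) = 0$, the augmented Lagrangian underlying an iALM scheme takes the standard form below (generic form, not quoted from the paper); "inexact" means each primal minimization in $x$ is solved only approximately before the dual update $y \leftarrow y + \beta A(x)$:

$$\mathcal{L}_\beta(x, y) \;=\; f(x) + \langle y, A(x) \rangle + \frac{\beta}{2}\|A(x)\|^2$$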
Author:
Latorre, Fabián
In the context of Structural Risk Minimization, one is presented with a sequence of classes $\{\mathcal{G}_j\}$ from which, given a random sample $(X_i,Y_i)$, one wants to choose a strongly consistent estimator. For certain types of classes of functions, we…
External link:
http://arxiv.org/abs/1609.02855
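A standard statement of the Structural Risk Minimization rule the abstract alludes to (the penalty term is generic, not the paper's): over the sample $(X_1,Y_1),\dots,(X_n,Y_n)$ one selects

$$\hat{g} \;=\; \arg\min_{j} \, \min_{g \in \mathcal{G}_j} \Big[ \frac{1}{n}\sum_{i=1}^{n} \ell\big(g(X_i), Y_i\big) + \mathrm{pen}(j, n) \Big],$$

trading empirical risk against a complexity penalty that grows with the class index $j$.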
Author:
Latorre, Fabian
The problem of Maxflow is a widely developed subject in modern mathematics. Efficient algorithms exist to solve this problem, which is why a good generalization may permit these algorithms to be understood as a particular instance of solutions in a wider…
External link:
http://arxiv.org/abs/1212.1406
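For reference, the classical linear-programming formulation of Maxflow from which such generalizations usually depart (textbook form, not taken from the paper): on a network $(V, E)$ with capacities $c_{uv}$, source $s$, and sink $t$,

$$\max_{f} \sum_{v:(s,v)\in E} f_{sv} \quad \text{s.t.} \quad 0 \le f_{uv} \le c_{uv} \;\; \forall (u,v)\in E, \qquad \sum_{u:(u,v)\in E} f_{uv} \;=\; \sum_{w:(v,w)\in E} f_{vw} \;\; \forall v \in V\setminus\{s,t\}.$$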