Showing 1 - 10 of 33 for search: '"Lacotte, Jonathan"'
Threshold activation functions are highly preferable in neural networks due to their efficiency in hardware implementations. Moreover, their mode of operation is more interpretable and resembles that of biological neurons. However, traditional gradient…
External link:
http://arxiv.org/abs/2303.03382
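The entry above contrasts hardware-friendly threshold activations with gradient-based training. Below is a minimal NumPy sketch of the difficulty (the step function has zero derivative almost everywhere) together with the common straight-through-estimator workaround; the toy network, loss, and the STE surrogate are illustrative assumptions, not the method proposed in the paper.

    import numpy as np

    rng = np.random.default_rng(0)

    def threshold(z):
        # Heaviside step: 0/1 outputs are cheap in hardware, but the
        # derivative is 0 almost everywhere, so exact gradients vanish.
        return (z >= 0).astype(z.dtype)

    # Toy single hidden layer: y_hat = v . threshold(W x)
    x = rng.standard_normal(8)
    W = rng.standard_normal((16, 8))
    v = rng.standard_normal(16)

    z = W @ x
    h = threshold(z)
    y_hat = v @ h

    # Straight-through estimator (a common workaround, not necessarily the
    # paper's approach): pretend d threshold(z)/dz = 1 on |z| <= 1 when
    # backpropagating, even though the true derivative is 0 a.e.
    y = 1.0                            # toy target
    dloss_dy = y_hat - y               # squared-loss gradient w.r.t. y_hat
    dh = dloss_dy * v                  # gradient reaching the hidden activations
    dz_ste = dh * (np.abs(z) <= 1.0)   # surrogate gradient through the step
    dW = np.outer(dz_ste, x)           # usable descent direction for W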
In second-order optimization, a potential bottleneck can be computing the Hessian matrix of the optimized function at every iteration. Randomized sketching has emerged as a powerful technique for constructing estimates of the Hessian which can be used…
External link:
http://arxiv.org/abs/2107.07480
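For intuition, a minimal sketch of Hessian estimation by randomized sketching, assuming the simplest case f(x) = 0.5*||A x - b||^2 whose Hessian is A^T A; the Gaussian sketch, problem sizes, and error metric below are illustrative choices, not the construction analyzed in the paper.

    import numpy as np

    rng = np.random.default_rng(0)
    n, d = 4000, 50
    A = rng.standard_normal((n, d))

    # Hessian of f(x) = 0.5*||A x - b||^2 is A^T A; forming it costs O(n d^2).
    H = A.T @ A

    # A Gaussian sketch S with m << n rows gives the estimate (S A)^T (S A),
    # formed from the much smaller m x d matrix S A. The error shrinks roughly
    # like sqrt(d/m) as the sketch size m grows.
    for m in (200, 800, 3200):
        S = rng.standard_normal((m, n)) / np.sqrt(m)
        SA = S @ A
        err = np.linalg.norm(SA.T @ SA - H, 2) / np.linalg.norm(H, 2)
        print(f"m = {m:4d}: relative spectral error {err:.2f}")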
We propose a randomized algorithm with quadratic convergence rate for convex optimization problems with a self-concordant, composite, strongly convex objective function. Our method is based on performing an approximate Newton step using a random projection…
External link:
http://arxiv.org/abs/2105.07291
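A hedged toy of an approximate Newton iteration with a sketched Hessian, applied to a ridge-regularized least-squares objective so that each step has a closed form; the objective class, sketch, and unit step are assumptions for illustration, and this toy only converges linearly rather than at the quadratic rate claimed in the entry above.

    import numpy as np

    rng = np.random.default_rng(1)
    n, d, m = 3000, 40, 300
    A = rng.standard_normal((n, d))
    b = rng.standard_normal(n)
    lam = 1.0                                  # ridge parameter (illustrative)

    def grad(x):
        # Gradient of f(x) = 0.5*||A x - b||^2 + 0.5*lam*||x||^2.
        return A.T @ (A @ x - b) + lam * x

    x = np.zeros(d)
    print("initial gradient norm:", np.linalg.norm(grad(x)))
    for it in range(20):
        # Fresh Gaussian sketch of A; (S A)^T (S A) + lam*I stands in for the
        # exact Hessian A^T A + lam*I in the Newton step.
        S = rng.standard_normal((m, n)) / np.sqrt(m)
        SA = S @ A
        H_approx = SA.T @ SA + lam * np.eye(d)
        x = x - np.linalg.solve(H_approx, grad(x))
    print("final gradient norm  :", np.linalg.norm(grad(x)))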
Author:
Lacotte, Jonathan, Pilanci, Mert
We consider least-squares problems with quadratic regularization and propose novel sketching-based iterative methods with an adaptive sketch size. The sketch size can be as small as the effective dimension of the data matrix to guarantee linear convergence…
External link:
http://arxiv.org/abs/2104.14101
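The effective dimension referred to above is commonly defined as d_eff(lam) = tr(A^T A (A^T A + lam*I)^(-1)) = sum_i sigma_i^2 / (sigma_i^2 + lam). A quick computation follows; the matrix and lam are arbitrary toy choices, and the paper's exact sketch-size rule may differ.

    import numpy as np

    rng = np.random.default_rng(2)
    # A toy matrix with a decaying spectrum, so the effective dimension is
    # much smaller than the number of columns.
    A = rng.standard_normal((5000, 100)) @ np.diag(0.9 ** np.arange(100))
    lam = 10.0

    sigma = np.linalg.svd(A, compute_uv=False)
    d_eff = np.sum(sigma**2 / (sigma**2 + lam))
    print(f"columns: {A.shape[1]}, effective dimension at lam={lam}: {d_eff:.1f}")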
Author:
Lacotte, Jonathan, Pilanci, Mert
We propose novel randomized optimization methods for high-dimensional convex problems based on restrictions of variables to random subspaces. We consider oblivious and data-adaptive subspaces and study their approximation properties via convex duality…
External link:
http://arxiv.org/abs/2012.07054
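A minimal sketch of the oblivious random-subspace idea, assuming a simple strongly convex quadratic so the restricted problem has a closed-form solution; the subspace distribution and the objective are illustrative stand-ins, not the methods studied in the paper.

    import numpy as np

    rng = np.random.default_rng(3)
    d, k = 2000, 100                 # ambient and subspace dimensions (toy)
    c = rng.standard_normal(d)

    def f(x):
        # A simple strongly convex objective standing in for the true problem.
        return 0.5 * np.dot(x, x) - np.dot(c, x)

    # Oblivious random subspace: restrict x = S.T @ z with a k x d Gaussian S,
    # then minimize the k-dimensional function g(z) = f(S.T z) instead of f.
    S = rng.standard_normal((k, d)) / np.sqrt(k)

    # For this quadratic, g(z) = 0.5*||S.T z||^2 - (S c).z has the closed-form
    # minimizer z* = (S S.T)^(-1) S c; a generic solver could be used instead.
    z_star = np.linalg.solve(S @ S.T, S @ c)
    x_restricted = S.T @ z_star

    print("f at restricted solution:", f(x_restricted))
    print("f at true minimizer     :", f(c))   # x* = c for this objective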
We prove that finding all globally optimal two-layer ReLU neural networks can be performed by solving a convex optimization program with cone constraints. Our analysis is novel, characterizes all optimal solutions, and does not leverage duality-based…
External link:
http://arxiv.org/abs/2006.05900
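Such convex reformulations are typically indexed by the finitely many ReLU activation patterns diag(1[X g >= 0]) that a linear neuron can induce on the data. A tiny, heavily simplified sketch of collecting such patterns by random sampling follows; the cone-constrained program itself, and the paper's actual construction, are not reproduced here.

    import numpy as np

    rng = np.random.default_rng(4)
    n, d = 20, 3
    X = rng.standard_normal((n, d))

    # Randomly sampled directions g recover a subset of the activation
    # patterns diag(1[X g >= 0]) that index the convex program's variables.
    patterns = set()
    for _ in range(2000):
        g = rng.standard_normal(d)
        patterns.add(tuple((X @ g >= 0).astype(int)))

    print(f"distinct ReLU activation patterns found on {n} points: {len(patterns)}")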
Author:
Lacotte, Jonathan, Pilanci, Mert
We propose a new randomized algorithm for solving L2-regularized least-squares problems based on sketching. We consider two of the most popular random embeddings, namely, Gaussian embeddings and the Subsampled Randomized Hadamard Transform (SRHT). Wh…
External link:
http://arxiv.org/abs/2006.05874
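A minimal sketch of the two embeddings named above, assuming a power-of-two number of rows so the Hadamard matrix can be formed explicitly; the sizes and the dense construction are illustrative (practical SRHT implementations use a fast transform instead), and this is not the paper's algorithm.

    import numpy as np
    from scipy.linalg import hadamard

    rng = np.random.default_rng(5)
    n, d, m = 1024, 20, 128          # n must be a power of two for this toy SRHT
    A = rng.standard_normal((n, d))

    # Gaussian embedding: an m x n matrix with i.i.d. N(0, 1/m) entries.
    S_gauss = rng.standard_normal((m, n)) / np.sqrt(m)
    SA_gauss = S_gauss @ A

    # SRHT embedding: flip signs of the rows (D), mix with an orthonormal
    # Hadamard transform (H), then subsample m rows, rescaled by sqrt(n/m).
    D = rng.choice([-1.0, 1.0], size=n)
    H = hadamard(n) / np.sqrt(n)
    rows = rng.choice(n, size=m, replace=False)
    SA_srht = np.sqrt(n / m) * (H @ (D[:, None] * A))[rows]

    # Both sketched Gram matrices approximate A^T A, the Hessian of the
    # unregularized least-squares term.
    for name, SA in (("gaussian", SA_gauss), ("srht", SA_srht)):
        err = np.linalg.norm(SA.T @ SA - A.T @ A, 2) / np.linalg.norm(A.T @ A, 2)
        print(f"{name:8s} relative error: {err:.3f}")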
Author:
Lacotte, Jonathan, Pilanci, Mert
We provide an exact analysis of a class of randomized algorithms for solving overdetermined least-squares problems. We consider first-order methods, where the gradients are pre-conditioned by an approximation of the Hessian, based on a subspace embedding…
External link:
http://arxiv.org/abs/2002.09488
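A hedged toy of a first-order method whose gradient steps are pre-conditioned by a sketch-based Hessian approximation computed once up front; the damping constant and problem sizes below are ad-hoc choices for this toy, not the exact step sizes analyzed in the paper.

    import numpy as np

    rng = np.random.default_rng(6)
    n, d, m = 5000, 60, 300
    A = rng.standard_normal((n, d))
    b = rng.standard_normal(n)

    # One sketch of the data, computed up front; its Gram matrix serves as a
    # fixed approximation of the Hessian A^T A for the whole run.
    S = rng.standard_normal((m, n)) / np.sqrt(m)
    SA = S @ A
    H_sketch = SA.T @ SA

    x = np.zeros(d)
    alpha = 0.5                       # conservative damping for this toy example
    for it in range(30):
        g = A.T @ (A @ x - b)         # exact gradient of 0.5*||A x - b||^2
        x = x - alpha * np.linalg.solve(H_sketch, g)

    x_star, *_ = np.linalg.lstsq(A, b, rcond=None)
    print("distance to the least-squares solution:", np.linalg.norm(x - x_star))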
Random projections or sketching are widely used in many algorithmic and learning contexts. Here we study the performance of iterative Hessian sketch for least-squares problems. By leveraging and extending recent results from random matrix theory on the…
External link:
http://arxiv.org/abs/2002.00864
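For concreteness, a minimal iterative-Hessian-sketch loop for least squares, with a fresh Gaussian sketch drawn at every iteration; the sketch size and iteration count are illustrative assumptions, and no claim is made about matching the exact constants studied in the entry above.

    import numpy as np

    rng = np.random.default_rng(7)
    n, d, m = 4000, 50, 500
    A = rng.standard_normal((n, d))
    b = rng.standard_normal(n)
    x = np.zeros(d)

    for it in range(15):
        # Iterative Hessian sketch: draw a fresh sketch, then take the step
        # that minimizes the sketched quadratic model around the current
        # iterate, i.e. solve (A^T S^T S A) step = A^T (b - A x).
        S = rng.standard_normal((m, n)) / np.sqrt(m)
        SA = S @ A
        step = np.linalg.solve(SA.T @ SA, A.T @ (b - A @ x))
        x = x + step

    x_star, *_ = np.linalg.lstsq(A, b, rcond=None)
    print("distance to the least-squares solution:", np.linalg.norm(x - x_star))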
Author:
Lacotte, Jonathan, Pilanci, Mert
We investigate iterative methods with randomized preconditioners for solving overdetermined least-squares problems, where the preconditioners are based on a random embedding of the data matrix. We consider two distinct approaches: the sketch is either…
External link:
http://arxiv.org/abs/1911.02675