Showing 1 - 7 of 7 for the search: '"Emanuele Frandi"'
Published in:
Machine Learning. 104:195-221
Frank–Wolfe (FW) algorithms have often been proposed over the last few years as efficient solvers for a variety of optimization problems arising in the field of machine learning. The ability to work with cheap projection-free iterations and the inc…
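As an illustration of the "cheap projection-free iterations" the snippet refers to, below is a minimal Frank–Wolfe sketch for a toy quadratic over the unit simplex; the objective, domain and step-size rule are assumptions chosen for illustration, not the setting of the paper.

    import numpy as np

    def frank_wolfe_simplex(grad, x0, n_iters=200):
        # Basic Frank-Wolfe loop: each iteration only solves a linear
        # minimization over the feasible set, so no projection is computed.
        x = x0.copy()
        for k in range(n_iters):
            g = grad(x)
            s = np.zeros_like(x)
            s[np.argmin(g)] = 1.0          # LMO over the unit simplex: best vertex
            gamma = 2.0 / (k + 2)          # classical diminishing step size
            x = (1 - gamma) * x + gamma * s
        return x

    # Toy usage: approximate the projection of b onto the simplex by
    # minimizing ||x - b||^2 (b is an assumed toy target).
    b = np.array([0.2, 0.5, 0.3])
    x = frank_wolfe_simplex(lambda v: 2.0 * (v - b), np.array([1.0, 0.0, 0.0]))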
Author:
Emanuele Frandi, Alessandra Papini
Published in:
Optimization Methods and Software. 30:1077-1094
Direct Search algorithms are classical derivative-free methods for optimization. Though endowed with solid theoretical properties, they are not well suited for large-scale problems due to slow convergence and scaling issues. In this paper, we discuss…
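For context on what a direct search step looks like, here is a minimal compass-search sketch (a generic member of the family, not the specific scheme analysed in the paper); polling all 2n coordinate directions per iteration also hints at why such methods scale poorly with problem dimension.

    import numpy as np

    def compass_search(f, x0, step=1.0, tol=1e-6, max_iters=1000):
        # Poll the 2n coordinate directions; accept any improving point,
        # otherwise halve the step size (the derivative-free "direct search" idea).
        x = np.asarray(x0, dtype=float)
        fx = f(x)
        for _ in range(max_iters):
            improved = False
            for i in range(x.size):
                for sign in (1.0, -1.0):
                    trial = x.copy()
                    trial[i] += sign * step
                    ft = f(trial)
                    if ft < fx:
                        x, fx, improved = trial, ft, True
            if not improved:
                step *= 0.5
                if step < tol:
                    break
        return x, fx

    # Toy usage on a smooth function, using no gradient information.
    x_min, f_min = compass_search(lambda v: (v[0] - 1.0) ** 2 + (v[1] + 2.0) ** 2,
                                  [0.0, 0.0])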
Author:
Alessandra Papini, Emanuele Frandi
Published in:
Optimization Methods and Software. 29:1020-1041
Many optimization problems of practical interest arise from the discretization of continuous problems. Classical examples can be found in calculus of variations, optimal control and image processing. In recent years a number of strategies have be…
Published in:
Progress in Pattern Recognition, Image Analysis, Computer Vision, and Applications ISBN: 9783319522760
CIARP
© Springer International Publishing AG 2017. Performing predictions using a non-linear support vector machine (SVM) can be too expensive in some large-scale scenarios. In the non-linear case, the complexity of storing and using the classifier is det…
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::d33ddeeacdf7c3fb45b3d0b78abae391
https://lirias.kuleuven.be/handle/123456789/633595
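To make the cost the snippet above refers to concrete, here is a minimal sketch of the non-linear SVM decision function; the kernel, variable names and toy model are illustrative assumptions, not the compression method proposed in the paper.

    import numpy as np

    def rbf_kernel(x, z, gamma=0.5):
        return np.exp(-gamma * np.sum((x - z) ** 2))

    def svm_decision(x, support_vectors, alphas, labels, bias, kernel=rbf_kernel):
        # One kernel evaluation per support vector: both prediction time and
        # the memory needed to store the model grow with the number of SVs.
        score = sum(a * y * kernel(x, sv)
                    for a, y, sv in zip(alphas, labels, support_vectors))
        return np.sign(score + bias)

    # Toy usage with a hypothetical 3-support-vector model.
    svs = np.array([[0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
    pred = svm_decision(np.array([0.5, 0.5]), svs,
                        alphas=[0.7, 0.3, 0.4], labels=[1, -1, 1], bias=0.1)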
Published in:
IJCNN
Frank-Wolfe algorithms have recently regained the attention of the Machine Learning community. Their solid theoretical properties and sparsity guarantees make them a suitable choice for a wide range of problems in this field. In addition, several var…
Published in:
Progress in Pattern Recognition, Image Analysis, Computer Vision, and Applications ISBN: 9783642166860
CIARP
It has been shown that many kernel methods can be equivalently formulated as minimal enclosing ball (MEB) problems in a certain feature space. Exploiting this reduction, efficient algorithms to scale up Support Vector Machines (SVMs) and other kernel…
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::c82dee2a17e6eab56aae3b4811122468
http://hdl.handle.net/11585/92562
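To illustrate the MEB reduction mentioned above, here is a minimal input-space sketch of a Badoiu-Clarkson-type approximation of the minimal enclosing ball; the kernelized variant used by core-set SVM solvers runs the same update implicitly in feature space. This is an illustration under those assumptions, not the algorithm of the paper.

    import numpy as np

    def approx_meb(points, n_iters=100):
        # Pull the centre towards the farthest point with step 1/(k+1),
        # a Frank-Wolfe-type iteration that yields an approximate MEB.
        c = points[0].astype(float).copy()
        for k in range(1, n_iters + 1):
            dists = np.linalg.norm(points - c, axis=1)
            far = points[np.argmax(dists)]
            c = c + (far - c) / (k + 1)
        radius = float(np.max(np.linalg.norm(points - c, axis=1)))
        return c, radius

    # Toy usage on a few 2-D points.
    centre, r = approx_meb(np.array([[0.0, 0.0], [2.0, 0.0], [1.0, 1.5]]))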
Training a support vector machine (SVM) requires the solution of a quadratic programming problem (QP) whose computational complexity becomes prohibitively expensive for large-scale datasets. Traditional optimization methods cannot be directly applied…
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::1bfd9d4e23c6a0e5e5aa80b321bf5632
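For reference, the sketch below spells out the standard SVM dual objective the snippet refers to; it only illustrates why forming the dense n-by-n kernel matrix makes the QP prohibitive for large datasets, and is not the solver proposed in the paper. The linear kernel and toy data are assumptions.

    import numpy as np

    def linear_kernel(a, b):
        return float(np.dot(a, b))

    def svm_dual_objective(alpha, X, y, kernel=linear_kernel):
        # Standard SVM dual: maximize sum(alpha) - 0.5 * alpha^T Q alpha with
        # Q_ij = y_i * y_j * k(x_i, x_j), subject to 0 <= alpha_i <= C and
        # sum(alpha_i * y_i) = 0.  Q is dense and n-by-n, hence the cost.
        n = X.shape[0]
        K = np.array([[kernel(X[i], X[j]) for j in range(n)] for i in range(n)])
        Q = np.outer(y, y) * K
        return float(np.sum(alpha) - 0.5 * alpha @ Q @ alpha)

    # Toy usage: three training points with labels +/-1 and a uniform alpha.
    X = np.array([[0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
    y = np.array([1.0, -1.0, 1.0])
    val = svm_dual_objective(np.array([0.1, 0.1, 0.1]), X, y)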