Optimization by Adaptive Stochastic Descent.

Authors: Cliff C Kerr, Salvador Dura-Bernal, Tomasz G Smolinski, George L Chadderdon, David P Wilson
Language: English
Publication year: 2018
Subject:
Source: PLoS ONE, Vol 13, Iss 3, p e0192944 (2018)
Document type: article
ISSN: 1932-6203
DOI: 10.1371/journal.pone.0192944
Abstract: When standard optimization methods fail to find a satisfactory solution for a parameter fitting problem, a tempting recourse is to adjust parameters manually. While tedious, this approach can be surprisingly powerful in terms of achieving optimal or near-optimal solutions. This paper outlines an optimization algorithm, Adaptive Stochastic Descent (ASD), that has been designed to replicate the essential aspects of manual parameter fitting in an automated way. Specifically, ASD uses simple principles to form probabilistic assumptions about (a) which parameters have the greatest effect on the objective function, and (b) optimal step sizes for each parameter. We show that for a certain class of optimization problems (namely, those with a moderate to large number of scalar parameter dimensions, especially if some dimensions are more important than others), ASD is capable of minimizing the objective function with far fewer function evaluations than classic optimization methods, such as the Nelder-Mead nonlinear simplex, Levenberg-Marquardt gradient descent, simulated annealing, and genetic algorithms. As a case study, we show that ASD outperforms standard algorithms when used to determine how resources should be allocated in order to minimize new HIV infections in Swaziland.
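The abstract describes ASD's two core ideas: adaptively learned per-parameter selection probabilities and per-parameter step sizes. The sketch below is an illustrative, simplified implementation of those principles, not the authors' reference code; the adaptation multipliers (`sinc`, `sdec`, `pinc`, `pdec`), the initial step size, and the parameter names are all assumptions chosen for the example.

```python
import numpy as np

def asd(objective, x0, maxiters=200, sinc=2.0, sdec=2.0, pinc=2.0, pdec=2.0):
    """Simplified sketch of Adaptive Stochastic Descent.

    Maintains a selection probability and a step size for each
    (parameter, direction) pair; both are increased after a step that
    improves the objective and decreased after a step that does not.
    """
    x = np.array(x0, dtype=float)
    n = len(x)
    probs = np.ones(2 * n)          # selection weights: first n = +dir, last n = -dir
    steps = np.full(2 * n, 0.1)     # current step size for each (param, direction)
    fval = objective(x)
    rng = np.random.default_rng(0)
    for _ in range(maxiters):
        # Sample which parameter to perturb, and in which direction
        choice = rng.choice(2 * n, p=probs / probs.sum())
        param = choice % n
        direction = 1.0 if choice < n else -1.0
        xnew = x.copy()
        xnew[param] += direction * steps[choice]
        fnew = objective(xnew)
        if fnew < fval:
            # Success: accept the step, grow its step size and probability
            x, fval = xnew, fnew
            steps[choice] *= sinc
            probs[choice] *= pinc
        else:
            # Failure: reject the step, shrink its step size and probability
            steps[choice] /= sdec
            probs[choice] /= pdec
    return x, fval
```

For example, `asd(lambda x: sum(v**2 for v in x), [2.0, 3.0])` drives a simple quadratic objective toward its minimum at the origin while concentrating evaluations on whichever coordinate currently yields the larger improvement.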
Database: Directory of Open Access Journals