Maximum likelihood-based online adaptation of hyper-parameters in CMA-ES
Author: Loshchilov, I., Schoenauer, M., Sebag, M., Hansen, N.
Contributors: Laboratory of Intelligent Systems (LIS), EPFL; Laboratoire de Recherche en Informatique (LRI), Université Paris-Sud - Paris 11 (UP11), CentraleSupélec, Centre National de la Recherche Scientifique (CNRS); Machine Learning and Optimisation (TAO), Inria Saclay - Ile de France, Institut National de Recherche en Informatique et en Automatique (Inria); editors: Th. Bartz-Beielstein, J. Branke, B. Filipič and J. Smith; funding: ANR-10-COSI-0002, SIMINOLE (Simulation methods for large-scale applications in experimental physics: statistical inference, optimization and discriminative learning, 2010)
Subject: FOS: Computer and information sciences; derivative-free optimization; stochastic optimization; evolutionary algorithms; CMA-ES; hyper-parameters; adaptation of hyper-parameters; Computer Science - Artificial Intelligence (cs.AI); Computer Science - Neural and Evolutionary Computing (cs.NE); Computer Science::Neural and Evolutionary Computation; Computer Science::Numerical Analysis; MathematicsofComputing_NUMERICALANALYSIS; Astrophysics::Solar and Stellar Astrophysics; [INFO.INFO-AI] Computer Science [cs]/Artificial Intelligence [cs.AI]
Source: 13th International Conference on Parallel Problem Solving from Nature (PPSN 2014), Sep 2014, Ljubljana, Slovenia, pp. 70-79; indexed in Scopus-Elsevier and CIÊNCIAVITAE
Description: The Covariance Matrix Adaptation Evolution Strategy (CMA-ES) is widely accepted as a robust derivative-free continuous optimization algorithm for non-linear and non-convex optimization problems. CMA-ES is well known to be almost parameterless: only one hyper-parameter, the population size, is typically left for the user to tune. In this paper, we propose a principled approach, called self-CMA-ES, for the online adaptation of CMA-ES hyper-parameters in order to improve its overall performance. Experimental results show that for larger-than-default population sizes, the default hyper-parameter settings of CMA-ES are far from optimal, and that self-CMA-ES allows these settings to be approached dynamically during the run.
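As context for the abstract's claim that the population size is the one hyper-parameter usually tuned by the user, the sketch below runs plain CMA-ES with a larger-than-default population size using the Python `cma` package. It is an illustrative baseline only, not the self-CMA-ES method described in the paper; the sphere objective, dimension, and popsize value are arbitrary choices for the example.

```python
import cma  # pip install cma

def sphere(x):
    """Simple test objective: sum of squares."""
    return sum(xi * xi for xi in x)

# Plain CMA-ES on a 10-dimensional sphere with a larger-than-default
# population size (popsize) -- the single hyper-parameter the abstract
# says is normally left to the user. All other hyper-parameters keep
# their default values here; self-CMA-ES (the paper) would adapt them online.
es = cma.CMAEvolutionStrategy(10 * [0.5], 0.3, {'popsize': 50})
while not es.stop():
    candidates = es.ask()                                  # sample a new population
    es.tell(candidates, [sphere(x) for x in candidates])   # rank and update the distribution
print(es.result.xbest, es.result.fbest)
```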
Database: OpenAIRE
External link: