NGO-GM: Natural Gradient Optimization for Graphical Models
Authors: Rida Laraki, David Saltiel, Jamal Atif, Eric Benhamou
Contributors: Laboratoire d'analyse et modélisation de systèmes pour l'aide à la décision (LAMSADE), Université Paris Dauphine-PSL, Université Paris sciences et lettres (PSL), Centre National de la Recherche Scientifique (CNRS), Université du Littoral Côte d'Opale (ULCO)
Language: English
Year of publication: 2020
Subject: FOS: Computer and information sciences; Computer Science - Machine Learning (cs.LG); Statistics - Machine Learning (stat.ML); [INFO.INFO-AI] Computer Science [cs]/Artificial Intelligence [cs.AI]; [MATH.MATH-OC] Mathematics [math]/Optimization and Control [math.OC]; Graphical model; Natural gradient; Descent (mathematics); Mathematical optimization; Optimization problem; Meta parameters; Trend detection; Overfitting; Distribution (mathematics); Computer science
Description: This paper addresses the estimation of model parameters in graphical models. We reformulate the problem as an information-geometric optimization problem and introduce a natural gradient descent strategy that incorporates additional meta parameters. We show that our approach is a strong alternative to the celebrated EM approach for learning in graphical models. Indeed, our natural-gradient-based strategy learns parameters that are optimal for the final objective function, without artificially trying to fit a distribution that may not correspond to the real one. We support our theoretical findings on the problem of trend detection in financial markets and show that the learned model performs better than traditional practitioner methods and is less prone to overfitting. 18 pages, 9 figures. (A generic sketch of a natural-gradient update is given after this record.)
Database: OpenAIRE
External link:
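
For intuition about the natural gradient descent mentioned in the description, here is a minimal sketch of a natural-gradient update, not the authors' actual algorithm (their parametrization and meta parameters are specific to graphical models). It estimates the mean of a Gaussian with known variance, where the Fisher information is known in closed form; all names, data, and step sizes below are hypothetical choices for illustration.

```python
import numpy as np

# Minimal sketch of natural gradient descent (generic illustration, not the
# paper's method). We estimate the mean of a Gaussian with known variance;
# the Fisher information for the mean is n / sigma^2, so the natural gradient
# preconditions the Euclidean gradient by the inverse Fisher information.

rng = np.random.default_rng(0)
sigma = 2.0                                  # known standard deviation (assumed)
data = rng.normal(loc=3.0, scale=sigma, size=500)

mu, lr = 0.0, 0.5                            # initial estimate, step size (hypothetical)
for _ in range(50):
    grad = -np.sum(data - mu) / sigma**2     # gradient of the negative log-likelihood
    fisher = data.size / sigma**2            # Fisher information of mu
    mu -= lr * grad / fisher                 # natural-gradient update

print(f"natural-gradient estimate: {mu:.3f}, sample mean: {data.mean():.3f}")
```

In this toy case the preconditioned step reduces to moving a fixed fraction of the distance toward the sample mean, which is what makes the natural gradient invariant to how the parameter is scaled; in richer models the Fisher matrix must be estimated or approximated.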