Showing 1 - 10 of 75 results for the search: '"Franca, Guilherme"'
Published in:
J. Phys.: Conf. Ser. 2667 (2023) 012027
The gauge-Miura correspondence establishes a map between the entire KdV and mKdV hierarchies, including both positive and negative flows, from which new relations beyond the standard Miura transformation arise. We use this correspondence to classify …
External link:
http://arxiv.org/abs/2312.14101
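For context on the entry above (standard textbook background, not taken from the abstract itself, with signs depending on convention): the classical Miura transformation referred to there maps a solution v of mKdV to a solution u of KdV via

\[
  u = v^2 + v_x,
  \qquad
  u_t - 6\,u\,u_x + u_{xxx} = \bigl(2v + \partial_x\bigr)\bigl(v_t - 6\,v^2 v_x + v_{xxx}\bigr),
\]

so whenever the mKdV factor on the right vanishes, u solves KdV; the gauge-Miura correspondence discussed in the entry extends this kind of relation to the full hierarchies, positive and negative flows alike.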
Published in:
J. High Energ. Phys. 2023, 160 (2023)
The KdV hierarchy is a paradigmatic example of the rich mathematical structure underlying integrable systems and has far-reaching connections in several areas of theoretical physics. While the positive part of the KdV hierarchy is well known, in this …
External link:
http://arxiv.org/abs/2304.01749
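For orientation on the entry above (standard material, quoted in one common sign convention among several): the first members of the positive KdV hierarchy are

\[
  u_{t_1} = u_x,
  \qquad
  u_{t_3} = 6\,u\,u_x - u_{xxx},
\]

the second of which is the KdV equation itself.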
In this work, we introduce Modern Portfolio Theory using basic concepts from linear algebra, differential calculus, statistics, and optimization. This theory allows us to measure the return and risk of an investment portfolio, serving as a basis for …
External link:
http://arxiv.org/abs/2208.07909
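A minimal sketch of the quantities the entry above refers to (the assets, weights, expected returns, and covariance below are made-up illustrative numbers, not taken from the text): the expected return of a portfolio with weight vector w is w^T mu and its variance is w^T Sigma w.

import numpy as np

# Hypothetical data: expected returns and covariance of three assets.
mu = np.array([0.08, 0.12, 0.05])            # expected annual returns
Sigma = np.array([[0.10, 0.02, 0.01],
                  [0.02, 0.20, 0.03],
                  [0.01, 0.03, 0.05]])        # covariance of asset returns
w = np.array([0.5, 0.3, 0.2])                 # portfolio weights, summing to 1

portfolio_return = w @ mu                     # w^T mu
portfolio_variance = w @ Sigma @ w            # w^T Sigma w
portfolio_risk = np.sqrt(portfolio_variance)  # standard deviation, the usual "risk"

print(portfolio_return, portfolio_risk)

Optimizing over w subject to a budget constraint, as the theory does, then trades these two quantities against each other.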
Author:
Carvalho-Pereira, João, Santos-Moreira, André, Cunha, Paulo-Diogo, Azevedo, Joana, Barbosa, Tiago, Varanda, Pedro, Lacueva-França, Guilherme
Published in:
In Fuss und Sprunggelenk December 2024 22(4):269-275
Author:
Barp, Alessandro, Da Costa, Lancelot, França, Guilherme, Friston, Karl, Girolami, Mark, Jordan, Michael I., Pavliotis, Grigorios A.
Published in:
Handbook of Statistics, vol. 46, pp. 21-78 (2022)
In this chapter, we identify fundamental geometric structures that underlie the problems of sampling, optimisation, inference and adaptive decision-making. Based on this identification, we derive algorithms that exploit these geometric structures to …
External link:
http://arxiv.org/abs/2203.10592
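One hedged example of the kind of geometry-driven algorithm this area studies (a standard baseline, not claimed to be the chapter's own method): the unadjusted Langevin algorithm discretizes the overdamped diffusion dX = -grad U(X) dt + sqrt(2) dW, whose stationary density is proportional to exp(-U).

import numpy as np

def grad_U(x):
    # Gradient of U(x) = ||x||^2 / 2, so the target is a standard Gaussian.
    return x

def unadjusted_langevin(x0, step=0.01, n_steps=10_000, rng=None):
    # Euler-Maruyama discretization of the overdamped Langevin diffusion.
    rng = rng or np.random.default_rng(0)
    x = np.array(x0, dtype=float)
    samples = []
    for _ in range(n_steps):
        noise = rng.standard_normal(x.shape)
        x = x - step * grad_U(x) + np.sqrt(2.0 * step) * noise
        samples.append(x.copy())
    return np.array(samples)

samples = unadjusted_langevin(x0=[3.0, -2.0])
print(samples.mean(axis=0), samples.var(axis=0))  # sample mean near 0, variance near 1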
Optimization tasks are crucial in statistical machine learning. Recently, there has been great interest in leveraging tools from dynamical systems to derive accelerated and robust optimization methods via suitable discretizations of continuous-time systems …
External link:
http://arxiv.org/abs/2107.11231
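A minimal, hedged illustration of the idea in the entry above (a generic example, not the paper's specific construction): forward-Euler discretization of the gradient flow x' = -grad f(x) with step size h recovers plain gradient descent.

import numpy as np

def grad_f(x):
    # Gradient of the toy quadratic f(x) = 0.5 * x^T A x with A = diag(1, 10).
    A = np.diag([1.0, 10.0])
    return A @ x

def euler_gradient_flow(x0, h=0.05, n_steps=200):
    # x_{k+1} = x_k - h * grad f(x_k): forward Euler on x' = -grad f(x),
    # which is exactly the gradient descent update.
    x = np.array(x0, dtype=float)
    for _ in range(n_steps):
        x = x - h * grad_f(x)
    return x

print(euler_gradient_flow([5.0, 5.0]))  # converges towards the minimizer at the origin

Other discretizations of richer continuous-time systems give accelerated or more robust methods in the same spirit.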
Author:
França, Guilherme, Bento, José
There has been an increasing need for scalable optimization methods, especially due to the explosion in the size of datasets and model complexity in modern machine learning applications. Scalable solvers often distribute the computation over a network …
External link:
http://arxiv.org/abs/2009.02604
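As a hedged toy illustration of distributing computation over a network (a plain consensus-averaging step, not the specific solver studied by the authors): each node repeatedly replaces its local value with a weighted average of its neighbours' values, and all nodes converge to the network-wide mean.

import numpy as np

# Hypothetical 4-node ring network; W is a symmetric, doubly stochastic mixing matrix.
W = np.array([[0.50, 0.25, 0.00, 0.25],
              [0.25, 0.50, 0.25, 0.00],
              [0.00, 0.25, 0.50, 0.25],
              [0.25, 0.00, 0.25, 0.50]])

x = np.array([1.0, 4.0, 2.0, 7.0])   # each node starts with a local value
for _ in range(100):
    x = W @ x                        # one round of exchanging values with neighbours

print(x, x.mean())  # every node's value approaches the global average 3.5

Distributed optimization methods build on exactly this kind of local exchange, interleaving it with local gradient or proximal steps.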
Published in:
J. Stat. Mech. (2021) 043402
Recently, continuous-time dynamical systems have proved useful in providing conceptual and quantitative insights into gradient-based optimization, widely used in modern machine learning and statistics. An important question that arises in this line of work …
External link:
http://arxiv.org/abs/2004.06840
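One hedged example of a question in this line of work is whether a discretization preserves the dissipative structure of the continuous-time system. Below is a simple splitting scheme for the damped dynamics x'' + gamma x' + grad f(x) = 0 that treats the damping exactly; it is a generic sketch, not claimed to be the integrator of the paper.

import numpy as np

def grad_f(x):
    # Toy quadratic objective f(x) = 0.5 * ||x||^2.
    return x

def damped_splitting(x0, gamma=1.0, h=0.1, n_steps=500):
    # Alternate an exact damping step for p' = -gamma * p with a
    # symplectic-Euler step for the conservative part (p' = -grad f, x' = p).
    x = np.array(x0, dtype=float)
    p = np.zeros_like(x)
    for _ in range(n_steps):
        p = np.exp(-gamma * h) * p      # exact solution of the damping flow
        p = p - h * grad_f(x)           # momentum update
        x = x + h * p                   # position update
    return x

print(damped_splitting([2.0, -1.5]))    # approaches the minimizer at the origin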
Published in:
Phys. Rev. E 103, 053304 (2021)
Optimization is at the heart of machine learning, statistics and many applied scientific disciplines. It also has a long history in physics, ranging from the minimal action principle to finding ground states of disordered systems such as spin glasses …
External link:
http://arxiv.org/abs/1908.00865
Published in:
J. Stat. Mech. (2020) 124008
Arguably, the two most popular accelerated or momentum-based optimization methods in machine learning are Nesterov's accelerated gradient and Polyak's heavy ball, both corresponding to different discretizations of a particular second-order differential equation …
External link:
http://arxiv.org/abs/1903.04100
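To make the claim in the entry above concrete (a standard presentation, with constants and scalings left hedged since the abstract is truncated): both methods can be read as discretizations of the damped second-order ODE

\[
  \ddot{x} + \gamma\,\dot{x} + \nabla f(x) = 0,
\]

with the discrete updates differing essentially in where the gradient is evaluated:

\[
  \text{heavy ball:}\quad x_{k+1} = x_k + \beta\,(x_k - x_{k-1}) - \alpha\,\nabla f(x_k),
  \qquad
  \text{Nesterov:}\quad x_{k+1} = y_k - \alpha\,\nabla f(y_k),
  \quad y_k = x_k + \beta\,(x_k - x_{k-1}).
\]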