Showing 1 - 4 of 4 for query: '"Tatzel, Lukas"'
Quadratic approximations form a fundamental building block of machine learning methods. E.g., second-order optimizers try to find the Newton step into the minimum of a local quadratic proxy to the objective function; and the second-order approximation…
External link:
http://arxiv.org/abs/2410.14325
Authors:
Roy, Hrittik, Miani, Marco, Ek, Carl Henrik, Hennig, Philipp, Pförtner, Marvin, Tatzel, Lukas, Hauberg, Søren
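The Newton step described in the first abstract can be sketched in a few lines: the step solves the local quadratic model exactly, so on a true quadratic it lands in the minimum in one move. This is a minimal illustration, not code from any of the listed papers.

```python
import numpy as np

def newton_step(grad, hess):
    """Solve H d = -g, i.e. jump to the minimizer of the local quadratic proxy
    f(x0 + d) ~ f(x0) + g.d + 0.5 d^T H d."""
    return -np.linalg.solve(hess, grad)

# Toy objective f(x) = 0.5 x^T A x, so gradient at x0 is A @ x0 and Hessian is A.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
x0 = np.array([1.0, -1.0])
g = A @ x0
x1 = x0 + newton_step(g, A)  # for an exactly quadratic objective, x1 is the minimizer
```

For a genuinely quadratic objective the single step recovers the exact minimizer (here the origin); on a general loss it is only as good as the local quadratic proxy.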
Current approximate posteriors in Bayesian neural networks (BNNs) exhibit a crucial limitation: they fail to maintain invariance under reparameterization, i.e. BNNs assign different posterior densities to different parametrizations of identical functions…
External link:
http://arxiv.org/abs/2406.03334
Bayesian Generalized Linear Models (GLMs) define a flexible probabilistic framework to model categorical, ordinal and continuous data, and are widely used in practice. However, exact inference in GLMs is prohibitively expensive for large datasets…
External link:
http://arxiv.org/abs/2310.20285
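A standard workaround for intractable GLM posteriors, which this abstract alludes to, is approximate inference; a Laplace approximation is one common choice. The sketch below fits a Bayesian logistic GLM by Newton iteration to the MAP weights and uses the inverse Hessian of the negative log-posterior as the Gaussian posterior covariance. All names here are illustrative and the code does not reproduce the paper's actual method.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def laplace_logistic(X, y, prior_prec=1.0, iters=20):
    """Laplace approximation for logistic regression with a Gaussian prior:
    returns the MAP weights and the inverse-Hessian posterior covariance."""
    w = np.zeros(X.shape[1])
    H = prior_prec * np.eye(X.shape[1])
    for _ in range(iters):                       # Newton iterations to the MAP
        p = sigmoid(X @ w)
        grad = X.T @ (p - y) + prior_prec * w    # gradient of neg. log-posterior
        s = p * (1.0 - p)                        # per-example Bernoulli variances
        H = X.T @ (s[:, None] * X) + prior_prec * np.eye(X.shape[1])
        w -= np.linalg.solve(H, grad)
    return w, np.linalg.inv(H)

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 2))
y = (X @ np.array([1.5, -1.0]) > 0).astype(float)  # synthetic labels
w_map, cov = laplace_logistic(X, y)
```

The Newton solve here is cheap because the toy problem has two parameters; the expense the abstract refers to arises when the dataset (and hence each Hessian build) is large.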
Curvature in form of the Hessian or its generalized Gauss-Newton (GGN) approximation is valuable for algorithms that rely on a local model for the loss to train, compress, or explain deep networks. Existing methods based on implicit multiplication via…
External link:
http://arxiv.org/abs/2106.02624
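The "implicit multiplication" that the last abstract mentions refers to computing curvature-vector products without ever materializing the curvature matrix. A minimal sketch for a least-squares loss L(w) = ½‖r(w)‖², where the GGN is JᵀJ with J the residual Jacobian: multiply by J, then by Jᵀ, two matrix-vector products instead of one matrix build. The function name is illustrative, not from the paper.

```python
import numpy as np

def ggn_vector_product(J, v):
    """Compute (J^T J) v via two matvecs, never forming the GGN explicitly.
    In deep learning, J @ v and J.T @ u would themselves be Jacobian-vector
    and vector-Jacobian products from automatic differentiation."""
    return J.T @ (J @ v)

rng = np.random.default_rng(0)
J = rng.standard_normal((5, 3))   # Jacobian of 5 residuals w.r.t. 3 parameters
v = rng.standard_normal(3)
gv = ggn_vector_product(J, v)
```

For a p-parameter model this costs O(p) memory per product, versus O(p²) to store the GGN itself, which is what makes implicit schemes viable for deep networks.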