Showing 1 - 10 of 234 for search '"Zanella, Giacomo"'
Generalized linear mixed models (GLMMs) are a widely used tool in statistical analysis. The main bottleneck of many computational approaches lies in the inversion of the high-dimensional precision matrices associated with the random effects. Such matrices…
External link:
http://arxiv.org/abs/2411.04729
Author:
Pozza, Francesco, Zanella, Giacomo
We study multiproposal Markov chain Monte Carlo algorithms, such as Multiple-try or generalised Metropolis-Hastings schemes, which have recently received renewed attention due to their amenability to parallel computing. First, we prove that no multiproposal…
External link:
http://arxiv.org/abs/2410.23174
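The multiproposal mechanism mentioned in this abstract can be sketched in a few lines. Below is a minimal, illustrative Multiple-try Metropolis step (the weight-simplified variant valid for symmetric proposals, where candidate weights reduce to the target density), with a hypothetical standard normal target; it is a toy sketch, not the exact schemes analyzed in the paper.

```python
import math
import random

def log_pi(x):
    # Hypothetical target: standard normal (unnormalized log-density).
    return -0.5 * x * x

def mtm_step(x, k=5, scale=1.0, rng=random):
    # Draw k candidates from a symmetric Gaussian proposal around x;
    # all k proposals could be evaluated in parallel.
    ys = [x + rng.gauss(0.0, scale) for _ in range(k)]
    wy = [math.exp(log_pi(y)) for y in ys]
    # Select one candidate with probability proportional to its weight.
    y = rng.choices(ys, weights=wy)[0]
    # Reference set: k-1 fresh draws around y, plus the current state x.
    xs = [y + rng.gauss(0.0, scale) for _ in range(k - 1)] + [x]
    wx = [math.exp(log_pi(z)) for z in xs]
    # Generalized Metropolis-Hastings accept/reject on the weight sums.
    if rng.random() < min(1.0, sum(wy) / sum(wx)):
        return y
    return x

random.seed(1)
x, samples = 0.0, []
for _ in range(20000):
    x = mtm_step(x)
    samples.append(x)
mean = sum(samples) / len(samples)
```

Run long enough, the chain's empirical mean and variance should match the standard normal target.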
The logit transform is arguably the most widely employed link function beyond linear settings. This transformation routinely appears in regression models for binary data and provides, either explicitly or implicitly, a core building block within stat…
External link:
http://arxiv.org/abs/2410.10309
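As a quick reminder of the link function this abstract refers to: the logit maps a probability to the real line, and its inverse (the logistic function) maps a linear predictor back to a probability, as in logistic regression. A minimal sketch (the coefficient values are arbitrary illustrations):

```python
import math

def logit(p):
    # Map a probability in (0, 1) to the real line.
    return math.log(p / (1.0 - p))

def inv_logit(eta):
    # Inverse logit (logistic/sigmoid): map back to (0, 1).
    return 1.0 / (1.0 + math.exp(-eta))

# In a logistic regression, P(Y=1 | x) = inv_logit(beta0 + beta1 * x).
beta0, beta1, x = -1.0, 2.0, 0.5
p = inv_logit(beta0 + beta1 * x)  # linear predictor is 0.0, so p = 0.5
```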
Author:
Ceriani, Paolo Maria, Zanella, Giacomo
We design and analyze unbiased Markov chain Monte Carlo (MCMC) schemes based on couplings of blocked Gibbs samplers (BGSs), whose total computational costs scale linearly with the number of parameters and data points. Our methodology is designed for…
External link:
http://arxiv.org/abs/2410.08939
The Gibbs sampler (a.k.a. Glauber dynamics and heat-bath algorithm) is a popular Markov chain Monte Carlo algorithm which iteratively samples from the conditional distributions of a probability measure $\pi$ of interest. Under the assumption that $\pi$…
External link:
http://arxiv.org/abs/2410.00858
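The conditional-sampling mechanism this abstract describes is easy to illustrate on a toy example (not the setting analyzed in the paper): a two-block Gibbs sampler for a bivariate Gaussian with unit variances and correlation rho, where both full conditionals are themselves Gaussian.

```python
import math
import random

def gibbs_bivariate_normal(n_iter=20000, rho=0.5, seed=0):
    # Gibbs sampler for a bivariate normal target with unit marginal
    # variances and correlation rho. Each full conditional is Gaussian:
    # x | y ~ N(rho * y, 1 - rho^2), and symmetrically for y | x.
    rng = random.Random(seed)
    sd = math.sqrt(1.0 - rho * rho)
    x = y = 0.0
    out = []
    for _ in range(n_iter):
        x = rng.gauss(rho * y, sd)  # sample from pi(x | y)
        y = rng.gauss(rho * x, sd)  # sample from pi(y | x)
        out.append((x, y))
    return out

samples = gibbs_bivariate_normal()
```

The empirical correlation of the draws should approach rho, and each marginal should look standard normal.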
Author:
Lavenant, Hugo, Zanella, Giacomo
The Coordinate Ascent Variational Inference scheme is a popular algorithm used to compute the mean-field approximation of a probability distribution of interest. We analyze its random scan version, under log-concavity assumptions on the target density…
External link:
http://arxiv.org/abs/2406.07292
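A minimal sketch of the random-scan coordinate updates this abstract refers to, on a hypothetical bivariate Gaussian target with mean mu and correlation rho (a textbook case where the optimal CAVI update of each variational mean is available in closed form; not the general setting of the paper):

```python
import random

def random_scan_cavi(mu, rho, n_iter=200, seed=0):
    # Random-scan CAVI for a mean-field Gaussian approximation of a
    # bivariate Gaussian target with mean mu, unit variances and
    # correlation rho. The optimal coordinate update sets the
    # variational mean of one coordinate to its conditional mean given
    # the other coordinate's current variational mean:
    #     m_i <- mu_i + rho * (m_j - mu_j)
    rng = random.Random(seed)
    m = [0.0, 0.0]  # variational means, arbitrary initialization
    for _ in range(n_iter):
        i = rng.randrange(2)  # pick a coordinate uniformly at random
        j = 1 - i
        m[i] = mu[i] + rho * (m[j] - mu[j])
    return m

m = random_scan_cavi(mu=(1.0, 2.0), rho=0.5)
```

For |rho| < 1 each update contracts the error, so the variational means converge to the target means (here the mean-field approximation recovers the means exactly, though it underestimates variances).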
Author:
Mauri, Lorenzo, Zanella, Giacomo
Published in:
Proceedings of the 27th International Conference on Artificial Intelligence and Statistics, AISTATS'24, volume 238, 2024, pages 2107-2115
Stochastic Gradient Markov chain Monte Carlo (SG-MCMC) algorithms are popular methods for Bayesian sampling in the presence of large datasets. However, they come with few theoretical guarantees, and assessing their empirical performance is no…
External link:
http://arxiv.org/abs/2405.08999
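A common representative of the SG-MCMC family mentioned above is Stochastic Gradient Langevin Dynamics (SGLD). The sketch below is a minimal, illustrative SGLD chain for a hypothetical conjugate setup (posterior of a Gaussian mean under a flat prior), not the experiments of the paper: the full-data gradient of the log-posterior is replaced by an unbiased minibatch estimate, followed by a Langevin step with injected Gaussian noise.

```python
import math
import random

rng = random.Random(0)
# Synthetic data: y_i ~ N(1, 1). With a flat prior on theta, the
# posterior for the mean theta is N(ybar, 1/N).
N = 100
data = [1.0 + rng.gauss(0.0, 1.0) for _ in range(N)]
ybar = sum(data) / N

def sgld(n_iter=20000, batch=10, eps=1e-3, seed=1):
    # SGLD: theta <- theta + (eps/2) * grad_estimate + sqrt(eps) * noise,
    # where grad_estimate is an unbiased minibatch estimate of the
    # full-data gradient sum_i (y_i - theta) of the log-posterior.
    r = random.Random(seed)
    theta, out = 0.0, []
    for _ in range(n_iter):
        mb = r.sample(data, batch)
        grad = (N / batch) * sum(y - theta for y in mb)
        theta += 0.5 * eps * grad + math.sqrt(eps) * r.gauss(0.0, 1.0)
        out.append(theta)
    return out

samples = sgld()[1000:]  # drop a short burn-in
```

For small step sizes the chain's draws should concentrate around ybar with spread close to the posterior standard deviation 1/sqrt(N); minibatch gradient noise adds a small extra inflation, which is one source of the bias such methods carry.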
We study general coordinate-wise MCMC schemes (such as Metropolis-within-Gibbs samplers), which are commonly used to fit Bayesian non-conjugate hierarchical models. We relate their convergence properties to those of the corresponding (potentially…
External link:
http://arxiv.org/abs/2403.09416
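The coordinate-wise schemes this abstract studies can be sketched as follows: at each step one coordinate is updated with a random-walk Metropolis move while the others are held fixed. This toy version uses a deterministic scan and a hypothetical product-of-standard-normals target, far simpler than the hierarchical models the paper considers.

```python
import math
import random

def log_pi(x):
    # Hypothetical target: product of two independent standard normals.
    return -0.5 * (x[0] ** 2 + x[1] ** 2)

def metropolis_within_gibbs(n_iter=30000, scale=2.0, seed=0):
    # Coordinate-wise MCMC: each step proposes a random-walk move in a
    # single coordinate and accepts it with the usual Metropolis ratio,
    # which here only involves the target (the proposal is symmetric).
    rng = random.Random(seed)
    x = [0.0, 0.0]
    out = []
    for t in range(n_iter):
        i = t % 2  # deterministic scan over the two coordinates
        prop = list(x)
        prop[i] = x[i] + rng.gauss(0.0, scale)
        if math.log(rng.random()) < log_pi(prop) - log_pi(x):
            x = prop
        out.append(tuple(x))
    return out

samples = metropolis_within_gibbs()
```

Each marginal of the draws should settle near a standard normal; because only one coordinate moves per step, mixing is slower than for a full Gibbs sweep, which is the kind of gap the paper's comparison to exact Gibbs samplers quantifies.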
While generalized linear mixed models (GLMMs) are a fundamental tool in applied statistics, many specifications -- such as those involving categorical factors with many levels or interaction terms -- can be computationally challenging to estimate due…
External link:
http://arxiv.org/abs/2312.13148
Author:
Ascolani, Filippo, Zanella, Giacomo
Gibbs samplers are popular algorithms to approximate posterior distributions arising from Bayesian hierarchical models. Despite their popularity and good empirical performance, however, there are still relatively few quantitative results on their convergence…
External link:
http://arxiv.org/abs/2304.06993