Showing 1 - 10 of 289 for search: '"Campbell, Trevor A."'
A Bayesian coreset is a small, weighted subset of a data set that replaces the full data during inference to reduce computational cost. The state-of-the-art coreset construction algorithm, Coreset Markov chain Monte Carlo (Coreset MCMC), uses draws f…
External link: http://arxiv.org/abs/2410.18973
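To make the coreset idea in this abstract concrete, here is a minimal sketch (toy Gaussian data, a hypothetical log_lik function, and arbitrary placeholder weights; this is not the paper's Coreset MCMC algorithm, which learns the weights from MCMC draws) showing how a small weighted subset stands in for the full data inside an unnormalized log-posterior:

    import numpy as np

    # Toy data and a hypothetical per-datum log-likelihood (assumed for illustration).
    rng = np.random.default_rng(0)
    x = rng.normal(loc=2.0, scale=1.0, size=10_000)   # full data set, size N

    def log_lik(theta, data):
        # Gaussian likelihood with unknown mean theta, unit variance
        return -0.5 * (data - theta) ** 2

    def log_prior(theta):
        return -0.5 * theta ** 2                       # N(0, 1) prior

    # A coreset: M << N points with nonnegative weights (uniform placeholder here;
    # Coreset MCMC would instead adapt these weights during sampling).
    idx = rng.choice(x.size, size=50, replace=False)
    w = np.full(50, x.size / 50.0)

    def log_post_full(theta):
        return log_prior(theta) + log_lik(theta, x).sum()

    def log_post_coreset(theta):
        return log_prior(theta) + np.dot(w, log_lik(theta, x[idx]))

    # Any MCMC kernel can now target log_post_coreset at O(M) cost per
    # evaluation instead of O(N).
    print(log_post_full(2.0), log_post_coreset(2.0))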
Author: Liu, Tiange, Surjanovic, Nikola, Biron-Lattes, Miguel, Bouchard-Côté, Alexandre, Campbell, Trevor
Many common Markov chain Monte Carlo (MCMC) kernels can be formulated using a deterministic involutive proposal with a step size parameter. Selecting an appropriate step size is often a challenging task in practice; and for complex multiscale targets …
External link: http://arxiv.org/abs/2410.18929
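As a deliberately simple instance of the kernel class described in this abstract (a generic textbook-style construction, not the algorithm proposed in the paper), the sketch below builds a Metropolis-Hastings step from the involution T(x, v) = (x + eps*v, -v) with auxiliary v ~ N(0, 1); eps is the step size parameter whose tuning the abstract refers to. Because the involution has unit Jacobian and the auxiliary density is symmetric, the acceptance ratio reduces to a ratio of target densities (this particular choice is equivalent to random-walk Metropolis).

    import numpy as np

    def involutive_step(x, log_target, eps, rng):
        """One MH step using the involution T(x, v) = (x + eps*v, -v), |det J| = 1."""
        v = rng.normal()                      # auxiliary variable
        x_new = x + eps * v                   # apply the involution (v maps to -v)
        log_accept = log_target(x_new) - log_target(x)
        if np.log(rng.uniform()) < log_accept:
            return x_new
        return x

    # Example: sample a standard normal. Too small an eps mixes slowly; too
    # large an eps is rejected often -- the tuning problem described above.
    rng = np.random.default_rng(1)
    log_target = lambda x: -0.5 * x ** 2
    x, draws = 0.0, []
    for _ in range(5_000):
        x = involutive_step(x, log_target, eps=1.0, rng=rng)
        draws.append(x)
    print(np.mean(draws), np.std(draws))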
Author: Luu, Son, Xu, Zuheng, Surjanovic, Nikola, Biron-Lattes, Miguel, Campbell, Trevor, Bouchard-Côté, Alexandre
The Hamiltonian Monte Carlo (HMC) algorithm is often lauded for its ability to effectively sample from high-dimensional distributions. In this paper we challenge the presumed domination of HMC for the Bayesian analysis of GLMs. By utilizing the struc…
External link: http://arxiv.org/abs/2410.03630
Author: Campbell, Trevor
Bayesian coresets speed up posterior inference in the large-scale data regime by approximating the full-data log-likelihood function with a surrogate log-likelihood based on a small, weighted subset of the data. But while Bayesian coresets and method…
External link: http://arxiv.org/abs/2405.11780
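In symbols, the surrogate described in this abstract replaces the full-data log-likelihood with a weighted sum over a small subset (standard coreset notation, not taken verbatim from this paper):

    \sum_{n=1}^{N} \log p(x_n \mid \theta) \;\approx\; \sum_{m \in S} w_m \log p(x_m \mid \theta), \qquad |S| \ll N, \; w_m \ge 0.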
Non-reversible parallel tempering (NRPT) is an effective algorithm for sampling from target distributions with complex geometry, such as those arising from posterior distributions of weakly identifiable and high-dimensional Bayesian models. In this w…
External link: http://arxiv.org/abs/2405.11384
This paper is intended to appear as a chapter for the Handbook of Markov Chain Monte Carlo. The goal of this chapter is to unify various problems at the intersection of Markov chain Monte Carlo (MCMC) and machine learning - which includ…
External link: http://arxiv.org/abs/2402.09598
Author: Biron-Lattes, Miguel, Surjanovic, Nikola, Syed, Saifuddin, Campbell, Trevor, Bouchard-Côté, Alexandre
Selecting the step size for the Metropolis-adjusted Langevin algorithm (MALA) is necessary in order to obtain satisfactory performance. However, finding an adequate step size for an arbitrary target distribution can be a difficult task and even the b…
External link: http://arxiv.org/abs/2310.16782
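For reference, one MALA update with step size eps looks like the sketch below (generic textbook MALA with a user-supplied gradient; the paper's contribution concerns how to choose eps, which this sketch leaves as a free parameter):

    import numpy as np

    def mala_step(x, log_target, grad_log_target, eps, rng):
        """One Metropolis-adjusted Langevin step with step size eps."""
        # Langevin proposal: Euler discretization of the overdamped Langevin SDE
        mean_fwd = x + 0.5 * eps ** 2 * grad_log_target(x)
        x_prop = mean_fwd + eps * rng.normal(size=x.shape)
        # Reverse proposal mean, needed for the Metropolis-Hastings correction
        mean_rev = x_prop + 0.5 * eps ** 2 * grad_log_target(x_prop)
        log_q_fwd = -np.sum((x_prop - mean_fwd) ** 2) / (2 * eps ** 2)
        log_q_rev = -np.sum((x - mean_rev) ** 2) / (2 * eps ** 2)
        log_accept = (log_target(x_prop) - log_target(x)) + (log_q_rev - log_q_fwd)
        if np.log(rng.uniform()) < log_accept:
            return x_prop
        return x

    # Example: standard normal target; eps trades off acceptance rate against
    # how far each proposal moves, which is exactly the tuning problem above.
    rng = np.random.default_rng(2)
    log_target = lambda x: -0.5 * np.sum(x ** 2)
    grad_log_target = lambda x: -x
    x = np.zeros(2)
    for _ in range(1_000):
        x = mala_step(x, log_target, grad_log_target, eps=0.8, rng=rng)
    print(x)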
Author: Chen, Naitong, Campbell, Trevor
A Bayesian coreset is a small, weighted subset of data that replaces the full dataset during inference in order to reduce computational cost. However, state-of-the-art methods for tuning coreset weights are expensive, require nontrivial user input, a…
External link: http://arxiv.org/abs/2310.17063
Simulated Tempering (ST) is an MCMC algorithm for complex target distributions that operates on a path between the target and a more amenable reference distribution. Crucially, if the reference enables i.i.d. sampling, ST is regenerative and can be p…
External link: http://arxiv.org/abs/2309.05578
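The path this abstract refers to is typically (though not necessarily in this particular paper) the geometric annealing path between a tractable reference \pi_0 and the target \pi_1:

    \pi_\beta(x) \;\propto\; \pi_0(x)^{1-\beta}\, \pi_1(x)^{\beta}, \qquad \beta \in [0, 1],

with the sampler moving both in x and along \beta. Whenever the chain visits \beta = 0 and the reference \pi_0 admits i.i.d. sampling, the state can be replaced by a fresh reference draw, which is the source of the regeneration mentioned above.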
Variational flows allow practitioners to learn complex continuous distributions, but approximating discrete distributions remains a challenge. Current methodologies typically embed the discrete target in a continuous space - usually via continuous re…
External link: http://arxiv.org/abs/2308.15613
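One standard way to embed a discrete target in a continuous space (given only as background; the abstract is cut off before naming the specific construction it discusses) is uniform dequantization: for a distribution p on Z^d, add uniform noise on the unit cube, y = x + u with u ~ Uniform[0,1)^d, which yields the continuous density

    q(y) \;=\; p(\lfloor y \rfloor), \qquad y \in \mathbb{R}^d,

so a draw y ~ q recovers x = \lfloor y \rfloor ~ p exactly, while q itself is piecewise constant and therefore awkward for flows that assume smooth densities.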