Showing 1 - 10 of 124 for search: '"Ročková, Veronika"'
We develop a multivariate posterior sampling procedure through deep generative quantile learning. Simulation proceeds implicitly through a push-forward mapping that can transform i.i.d. random vector samples into draws from the posterior. We utilize Monge-Kantorovich…
External link:
http://arxiv.org/abs/2410.08378
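To make the push-forward idea in the entry above concrete, here is a minimal sketch (a generic illustration, not the paper's method): push_forward is a hypothetical stand-in for a trained deep generative quantile network, implemented as a toy affine map so the snippet runs end to end.

    # Minimal push-forward sampling sketch; the "trained" map below is a toy affine
    # stand-in for a deep generative quantile network (assumption, not the paper's model).
    import numpy as np

    rng = np.random.default_rng(0)

    def push_forward(u):
        """Hypothetical learned transport map T: i.i.d. reference draws -> posterior draws."""
        A = np.array([[1.0, 0.3], [0.0, 0.5]])   # placeholder for learned weights
        b = np.array([2.0, -1.0])                # placeholder for learned shift
        return u @ A.T + b

    u = rng.standard_normal((1000, 2))   # i.i.d. reference samples
    theta = push_forward(u)              # approximate posterior draws, obtained implicitly
    print(theta.mean(axis=0))            # Monte Carlo posterior mean from the implicit sampler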
This work is concerned with conformal prediction in contemporary applications (including generative AI) where a black-box model has been trained on data that are not accessible to the user. Mirroring split-conformal inference, we design a wrapper around…
External link:
http://arxiv.org/abs/2408.08990
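As an illustration of the split-conformal wrapper idea in the entry above, here is a minimal sketch assuming only black-box access to a fitted predictor; model_predict and the synthetic data are hypothetical placeholders, not the paper's construction.

    # Minimal split-conformal sketch around an opaque predictor (requires numpy >= 1.22).
    import numpy as np

    def model_predict(x):
        """Stand-in for a black-box model trained on data the user never sees."""
        return 2.0 * x

    def split_conformal_interval(x_cal, y_cal, x_new, alpha=0.1):
        scores = np.abs(y_cal - model_predict(x_cal))          # calibration residuals
        n = len(scores)
        # finite-sample corrected quantile of the calibration scores
        q = np.quantile(scores, np.ceil((n + 1) * (1 - alpha)) / n, method="higher")
        pred = model_predict(x_new)
        return pred - q, pred + q                              # marginal (1 - alpha) coverage

    rng = np.random.default_rng(1)
    x_cal = rng.uniform(0, 1, 200)
    y_cal = 2.0 * x_cal + rng.normal(0, 0.1, 200)
    lo, hi = split_conformal_interval(x_cal, y_cal, np.array([0.5]))
    print(lo, hi)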
In generative models with obscured likelihood, Approximate Bayesian Computation (ABC) is often the tool of last resort for inference. However, ABC demands many prior parameter trials to keep only a small fraction that passes an acceptance test. …
External link:
http://arxiv.org/abs/2404.10436
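For context, here is a minimal rejection-ABC sketch, i.e. the generic baseline whose wasteful acceptance step the entry above seeks to address; the Gaussian simulator, prior, and tolerance are illustrative assumptions.

    # Generic rejection ABC: many prior trials, few acceptances (illustrative toy model).
    import numpy as np

    rng = np.random.default_rng(2)
    y_obs = rng.normal(3.0, 1.0, 50)            # "observed" data with unknown mean
    s_obs = y_obs.mean()                        # summary statistic

    def simulate(theta, n=50):
        return rng.normal(theta, 1.0, n)        # simulator standing in for an intractable likelihood

    accepted = []
    for _ in range(20_000):                     # many prior parameter trials ...
        theta = rng.normal(0.0, 5.0)            # prior draw
        if abs(simulate(theta).mean() - s_obs) < 0.1:   # ... only a few pass the acceptance test
            accepted.append(theta)

    print(len(accepted), np.mean(accepted))     # accepted draws approximate the posterior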
Author:
Kim, Jungeum, Rockova, Veronika
There is no other model or hypothesis verification tool in Bayesian statistics that is as widely used as the Bayes factor. We focus on generative models that are likelihood-free and, therefore, render the computation of Bayes factors (marginal likelihood…
External link:
http://arxiv.org/abs/2312.05411
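For reference, the quantity in question is the standard Bayes factor, the ratio of the two competing models' marginal likelihoods (textbook definition, not the paper's estimator):

    BF_{01}(y) = \frac{m_0(y)}{m_1(y)}, \qquad
    m_k(y) = \int p(y \mid \theta_k, M_k)\, \pi(\theta_k \mid M_k)\, d\theta_k .

Both integrals require evaluating the likelihood, which is exactly what is unavailable in the likelihood-free setting above.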
Multivariate Item Response Theory (MIRT) is widely sought after by applied researchers looking for interpretable (sparse) explanations underlying response patterns in questionnaire data. There is, however, an unmet demand for such sparsity discovery…
External link:
http://arxiv.org/abs/2310.17820
Author:
Rockova, Veronika
Bayesian predictive inference provides a coherent description of entire predictive uncertainty through predictive distributions. We examine several widely used sparsity priors from the predictive (as opposed to estimation) inference viewpoint. …
External link:
http://arxiv.org/abs/2309.02369
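For orientation, the predictive distributions referred to above are the usual posterior predictive (a textbook identity, not specific to this paper):

    p(y_{\text{new}} \mid y) = \int p(y_{\text{new}} \mid \theta)\, \pi(\theta \mid y)\, d\theta ,

so the predictive viewpoint judges a sparsity prior by the quality of this integrated forecast rather than by how accurately \theta itself is recovered.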
Author:
Kim, Jungeum, Rockova, Veronika
The success of Bayesian inference with MCMC depends critically on Markov chains rapidly reaching the posterior distribution. Despite the plenitude of inferential theory for posteriors in Bayesian non-parametrics, convergence properties of MCMC algorithms…
External link:
http://arxiv.org/abs/2306.00126
Author:
Wang, Yuexi, Ročková, Veronika
In the absence of explicit or tractable likelihoods, Bayesians often resort to approximate Bayesian computation (ABC) for inference. Our work bridges ABC with deep neural implicit samplers based on generative adversarial networks (GANs) and adversarial…
External link:
http://arxiv.org/abs/2208.12113
Author:
Nie, Lizhen, Rockova, Veronika
For a Bayesian, the task to define the likelihood can be as perplexing as the task to define the prior. We focus on situations when the parameter of interest has been emancipated from the likelihood and is linked to data directly through a loss function…
External link:
http://arxiv.org/abs/2205.15374
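The loss-based link mentioned above is commonly formalized through a Gibbs (generalized) posterior; a standard form, given here for orientation rather than as the paper's exact construction, is

    \pi_\lambda(\theta \mid y) \propto \exp\{-\lambda\, \ell(\theta, y)\}\, \pi(\theta) ,

where the loss \ell(\theta, y) takes the place of a likelihood and \lambda > 0 calibrates how strongly the data weigh against the prior.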
Approximate Bayesian Computation (ABC) enables statistical inference in simulator-based models whose likelihoods are difficult to calculate but easy to simulate from. ABC constructs a kernel-type approximation to the posterior distribution through an…
External link:
http://arxiv.org/abs/2111.11507
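The kernel-type approximation referred to above is typically written as follows (a standard ABC formulation with summary statistic s, kernel K_\varepsilon, and tolerance \varepsilon; not necessarily this paper's exact notation):

    \pi_\varepsilon(\theta \mid y) \propto \pi(\theta) \int K_\varepsilon\big(\|s(x) - s(y)\|\big)\, p(x \mid \theta)\, dx ,

which is estimated by simulating x from the model at candidate \theta and weighting (or accepting) draws according to how close s(x) falls to s(y).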