Generalization of Jeffreys Divergence-Based Priors for Bayesian Hypothesis Testing
Author: Maria J. Bayarri, Gonzalo García-Donato
Year of publication: 2008
Subjects: Statistics and Probability; Kullback–Leibler divergence; Markov chain; Markov chain Monte Carlo; Bayes factor; Mixture model; Prior probability; Econometrics; Applied mathematics; Statistics, Probability and Uncertainty; Divergence (statistics); Statistical hypothesis testing; Mathematics
Source: Journal of the Royal Statistical Society, Series B: Statistical Methodology, 70:981–1003
ISSN: 1467-9868; 1369-7412
DOI: 10.1111/j.1467-9868.2008.00667.x
Description: We introduce objective proper prior distributions for hypothesis testing and model selection based on measures of divergence between the competing models; we call them divergence-based (DB) priors. DB priors have simple forms and desirable properties, such as information (finite-sample) consistency, and are often similar to other existing proposals, such as intrinsic priors. Moreover, in normal linear model scenarios they reproduce the Jeffreys–Zellner–Siow priors exactly. Most importantly, in challenging scenarios such as irregular models and mixture models, DB priors are well defined and very reasonable, whereas alternative proposals are not. We derive approximations to the DB priors as well as Markov chain Monte Carlo and asymptotic expressions for the associated Bayes factors.
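Since the description defines DB priors only in words, the following LaTeX sketch records the standard ingredients they are built from: the Jeffreys (symmetrized Kullback–Leibler) divergence and the Bayes factor it feeds into. The power-law prior form shown, including the notation $\pi^{D}$ and the exponent $q$, is an illustrative assumption consistent with the description, not the paper's exact definition.

```latex
% Testing H0: y ~ f(y | theta_0)  versus  H1: y ~ f(y | theta), theta in Theta.
% Jeffreys divergence between a point theta and the null model (standard definition):
\[
  J(\theta) \;=\; \mathrm{KL}\!\bigl(f_{\theta}\,\|\,f_{\theta_0}\bigr)
             \;+\; \mathrm{KL}\!\bigl(f_{\theta_0}\,\|\,f_{\theta}\bigr),
  \qquad
  \mathrm{KL}(f\,\|\,g) \;=\; \int f(y)\,\log\frac{f(y)}{g(y)}\,dy .
\]
% A divergence-based prior is a proper density that decays as J(theta) grows;
% this particular power-law form is an assumed illustration (pi^D, q hypothetical):
\[
  \pi^{D}(\theta) \;\propto\; \bigl(1 + J(\theta)\bigr)^{-q},
  \qquad q \text{ chosen large enough that } \pi^{D} \text{ is proper.}
\]
% The resulting Bayes factor of H1 against H0 (standard definition):
\[
  B_{10} \;=\; \frac{\int f(y\mid\theta)\,\pi^{D}(\theta)\,d\theta}{f(y\mid\theta_0)} .
\]
```

Because $\pi^{D}$ is proper, the marginal likelihood in the numerator is well defined, which is what makes such priors usable for hypothesis testing where improper objective priors would leave the Bayes factor determined only up to an arbitrary constant.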
Database: OpenAIRE
External link: