A Comparison of Two Methods for Obtaining a Collective Posterior Distribution

Author: Rafael Catoia Pulgrossi, Rafael Izbicki, Natalia L. Oliveira, Adriano Polpo
Year of publication: 2018
Subject:
Source: Springer Proceedings in Mathematics & Statistics ISBN: 9783319911427
DOI: 10.1007/978-3-319-91143-4_21
Description: Bayesian inference is a powerful framework that allows individuals to update their knowledge about a phenomenon as more information about it becomes available. In this paradigm, before data are observed, an individual expresses their uncertainty about the phenomenon of interest through a prior probability distribution. After data are observed, this distribution is updated using Bayes' theorem. In many situations, however, one wishes to evaluate the knowledge of a group rather than of a single individual. In this case, information from different sources can be combined by mixing their uncertainty distributions. The mixture can be formed in two ways: before or after the data are observed. Although both approaches yield a collective posterior distribution, the two distributions can be substantially different. In this work, we present several comparisons between these two approaches under noninformative priors and use the Kullback–Leibler divergence to quantify the amount of information gained by each collective distribution.
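A minimal sketch of the two mixing schemes contrasted in the abstract, written as linear opinion pooling of k individual priors \pi_1,\dots,\pi_k with nonnegative weights w_1,\dots,w_k summing to one; the weights and notation are assumptions for illustration, since the record itself does not specify the pooling rule.

% Collective posterior from mixing BEFORE the data (pool the priors, then update):
\[
  \pi^{\mathrm{pool}}(\theta) = \sum_{i=1}^{k} w_i\,\pi_i(\theta),
  \qquad
  \pi^{\mathrm{pool}}(\theta \mid x) \;\propto\; L(x \mid \theta)\sum_{i=1}^{k} w_i\,\pi_i(\theta).
\]
% Collective posterior from mixing AFTER the data (update each prior, then pool):
\[
  \tilde{\pi}(\theta \mid x) = \sum_{i=1}^{k} w_i\,\pi_i(\theta \mid x),
  \qquad
  \pi_i(\theta \mid x) \;\propto\; L(x \mid \theta)\,\pi_i(\theta).
\]
% Information gain quantified by the Kullback--Leibler divergence between densities p and q:
\[
  \mathrm{KL}\!\left(p \,\|\, q\right)
  = \int p(\theta)\,\log\frac{p(\theta)}{q(\theta)}\,\mathrm{d}\theta.
\]

The two collective posteriors generally differ because updating the pooled prior reweights each component posterior by its marginal likelihood, \(w_i' \propto w_i \int L(x \mid \theta)\,\pi_i(\theta)\,\mathrm{d}\theta\), whereas pooling the individual posteriors keeps the original weights w_i fixed.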
Database: OpenAIRE