Nonparametric Statistics: Advanced Computational Methods
| Author | Jayaram Sethuraman, Myles Hollander |
|---|---|
| Year of publication | 2015 |
| Subject | Bayesian probability; Nonparametric statistics; Inference; Machine learning; Bayesian statistics; Statistical inference; Probability distribution; Artificial intelligence; Algorithm; Mathematics; Parametric statistics; Gibbs sampling |
| DOI | 10.1016/b978-0-08-097086-8.42155-0 |
| Description | Statistical methods are used to obtain information about the unknown state of nature, usually referred to in the literature as the 'parameter'. A statistician collects suitable data whose distribution depends on the unknown parameter, and a statistical inference procedure is then devised to produce information about that parameter or a function of it. Classical methods of inference assume that the probability distributions governing the data depend on only a few unknown quantities (or parameters); such procedures are called parametric procedures. When one cannot make such strong assumptions about the probability models governing the data, and/or when one is more confident of the rankings among the data than of their exact values, one uses robust inference procedures known as nonparametric methods. Tests and estimates based on the simpler nonparametric methods can be obtained from straightforward computations on the data; procedures that remain valid under more general models require heavier computation. When there is some uncertainty about the probability models and/or when expert information about the problem at hand is available, one should use Bayesian methods, which also tend to be computationally intensive. We briefly describe the main ingredients of some new Bayesian and computational methods in nonparametric inference, including bootstrapping and Gibbs sampling (see the sketches below). |
| Database | OpenAIRE |
| External link | |
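The description mentions bootstrapping and Gibbs sampling only by name. As an illustration, not code from the article by Sethuraman and Hollander, here is a minimal Python sketch of the percentile bootstrap: resample the data with replacement, recompute a statistic such as the median on each resample, and take a confidence interval from the empirical quantiles. The function name `bootstrap_ci`, the choice of the median, and the simulated data are all assumptions made for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_ci(data, stat=np.median, n_boot=10_000, alpha=0.05, rng=rng):
    """Percentile bootstrap confidence interval for a statistic (illustrative sketch)."""
    data = np.asarray(data)
    n = data.size
    # Draw n_boot resamples of size n with replacement and evaluate the statistic on each.
    idx = rng.integers(0, n, size=(n_boot, n))
    boot_stats = np.apply_along_axis(stat, 1, data[idx])
    lo, hi = np.quantile(boot_stats, [alpha / 2, 1 - alpha / 2])
    return lo, hi

# Skewed data for which the median is a natural nonparametric summary.
sample = rng.lognormal(mean=0.0, sigma=1.0, size=50)
print("sample median:", np.median(sample))
print("95% bootstrap CI for the median:", bootstrap_ci(sample))
```

On the Bayesian side, Gibbs sampling alternates draws from each parameter's full conditional distribution. The sketch below is likewise an assumed example rather than the article's method: it uses the textbook Normal(mu, 1/tau) model with conjugate Normal and Gamma priors, and the hyperparameters, starting values, and data are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)

def gibbs_normal(data, n_iter=5_000, mu0=0.0, tau0=1e-2, a0=1e-2, b0=1e-2, rng=rng):
    """Gibbs sampler for Normal(mu, 1/tau) data with assumed priors
    mu ~ Normal(mu0, 1/tau0) and tau ~ Gamma(a0, rate=b0)."""
    data = np.asarray(data)
    n, xbar = data.size, data.mean()
    mu, tau = xbar, 1.0                      # crude starting values
    draws = np.empty((n_iter, 2))
    for t in range(n_iter):
        # mu | tau, data  ~  Normal with precision tau0 + n*tau
        prec = tau0 + n * tau
        mean = (tau0 * mu0 + tau * n * xbar) / prec
        mu = rng.normal(mean, 1.0 / np.sqrt(prec))
        # tau | mu, data  ~  Gamma(a0 + n/2, rate = b0 + 0.5 * sum((x - mu)^2))
        rate = b0 + 0.5 * np.sum((data - mu) ** 2)
        tau = rng.gamma(a0 + 0.5 * n, 1.0 / rate)   # NumPy's gamma takes scale = 1/rate
        draws[t] = mu, tau
    return draws

samples = gibbs_normal(rng.normal(loc=2.0, scale=1.5, size=40))
kept = samples[1_000:]                       # discard burn-in
print("posterior mean of mu:", kept[:, 0].mean())
print("posterior mean of sigma:", (1.0 / np.sqrt(kept[:, 1])).mean())
```

Both sketches need only NumPy. In practice one would check convergence of the Gibbs chain (for example with trace plots) and, where the percentile interval is too crude, use more bootstrap replicates or a bias-corrected variant.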