A geometric approach to statistical estimation
Author: | R. Kulhavy |
---|---|
Year of publication: | 1995 |
Subject: | Statistics::Theory, Kullback–Leibler divergence, Statistical distance, Shannon's source coding theorem, Principle of maximum entropy, Statistics::Computation, Differential entropy, Rényi entropy, Statistics::Machine Learning, Total variation distance of probability measures, Statistics, Maximum entropy probability distribution, Statistics::Methodology, Applied mathematics, Mathematics |
Source: | Proceedings of 1995 34th IEEE Conference on Decision and Control |
DOI: | 10.1109/cdc.1995.480237 |
Description: | The role of Kerridge inaccuracy, Shannon entropy, and Kullback–Leibler distance in statistical estimation is shown for both discrete and continuous observations. The cases of data independence and regression-type dependence are treated in parallel. Pythagorean-like relations valid for probability distributions are presented, and their importance for estimation from compressed data is indicated. |
Database: | OpenAIRE |
External link: |
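
For orientation, the quantities named in the description are connected by standard identities. The sketch below uses textbook definitions for a discrete distribution and is not taken from the paper itself; the paper's own notation and the continuous-observation analogues may differ.

```latex
% Standard information-theoretic quantities referenced in the description.
% Textbook definitions for discrete distributions p, q; not the paper's notation.
\begin{align*}
  % Shannon entropy of p
  H(p) &= -\sum_{x} p(x)\,\log p(x) \\
  % Kerridge inaccuracy of q with respect to p (the cross-entropy)
  K(p,q) &= -\sum_{x} p(x)\,\log q(x) \\
  % Kullback--Leibler distance, which decomposes the inaccuracy as
  % K(p,q) = H(p) + D(p \| q)
  D(p\,\|\,q) &= \sum_{x} p(x)\,\log\frac{p(x)}{q(x)} \;=\; K(p,q) - H(p)
\end{align*}
% Pythagorean-like relation (Csiszár): if p^{*} is the I-projection of q
% onto a linear family of distributions containing p, then
%   D(p \| q) = D(p \| p^{*}) + D(p^{*} \| q),
% which is the geometric identity the description alludes to; for general
% convex sets the relation holds only as an inequality (>=).
```

The decomposition K(p,q) = H(p) + D(p‖q) is what ties the three quantities together: minimizing inaccuracy over q at fixed p is equivalent to minimizing the Kullback–Leibler distance, which is the estimation-theoretic reading suggested by the description.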