Estimation of Mutual Information in Two-Class Pattern Recognition
| Author: | G. A. Butler, H. B. Ritea |
| --- | --- |
| Year of publication: | 1974 |
| Subject: | Conditional mutual information, Estimator, Pattern recognition, Mutual information, Joint entropy, Entropy estimation, Maximum entropy probability distribution, Entropy (information theory), Class variable, Statistics, Artificial intelligence, Theoretical Computer Science, Computational Theory and Mathematics, Hardware and Architecture, Software, Mathematics |
| Source: | IEEE Transactions on Computers. :410-420 |
| ISSN: | 0018-9340 |
| DOI: | 10.1109/t-c.1974.223956 |
| Description: | Although mutual information (MI) has been proposed for some time as a measure of the dependence between the class variable and pattern recognition features, it is only recently that the practical problems of designing computer programs to use MI have been raised. Within the two-class context, this paper compares two traditional approaches to the requisite entropy estimation (using the maximum likelihood and expected value estimators of class probabilities) with a new estimator: the expected value of binomial entropy (E). The latter is shown to be superior where one class has a priori dominance. E is also related to expected probability of error and, in a surprising result, it is shown that E is a better estimator of class probabilities than the maximum likelihood and expected value estimators over a wide range. |
| Database: | OpenAIRE |
| External link: | |
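
The description above contrasts three ways of estimating the two-class (binary) entropy from sample counts. The following is a minimal illustrative sketch of that comparison, not the authors' exact formulation: the uniform Beta(1, 1) prior, the base-2 logarithm, and the numerical quadrature used for `expected_binomial_entropy` are all assumptions made here for illustration.

```python
import math
from scipy import integrate
from scipy.stats import beta


def binary_entropy(p):
    """Two-class entropy in bits; taken to be 0 at p = 0 or p = 1."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)


def ml_entropy(k, n):
    """Plug-in estimate using the maximum likelihood class probability k / n."""
    return binary_entropy(k / n)


def ev_entropy(k, n):
    """Plug-in estimate using the posterior-mean ('expected value') class
    probability (k + 1) / (n + 2); a uniform prior is assumed here."""
    return binary_entropy((k + 1) / (n + 2))


def expected_binomial_entropy(k, n):
    """E: the expected value of the binary entropy, averaged over the
    Beta(k + 1, n - k + 1) posterior of the class probability.
    The uniform prior and the quadrature are illustrative assumptions,
    not necessarily the paper's construction."""
    posterior = beta(k + 1, n - k + 1)
    value, _ = integrate.quad(
        lambda p: binary_entropy(p) * posterior.pdf(p), 0.0, 1.0
    )
    return value


if __name__ == "__main__":
    # A small sample dominated by one class: k of n observations in class 1.
    k, n = 0, 8
    print("ML plug-in entropy:          ", ml_entropy(k, n))
    print("Expected-value plug-in:      ", ev_entropy(k, n))
    print("Expected binomial entropy E: ", expected_binomial_entropy(k, n))
```

With one class heavily dominant in a small sample, the maximum likelihood plug-in reports zero entropy (apparent certainty), while E retains a positive value reflecting the residual uncertainty, which is consistent with the description's claim that E behaves better when one class has a priori dominance.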