Direct Estimation of Information Divergence Using Nearest Neighbor Ratios

Authors: Noshad, Morteza; Moon, Kevin R.; Sekeh, Salimeh Yasaei; Hero III, Alfred O.
Publication Year: 2017
Source: In 2017 IEEE International Symposium on Information Theory (ISIT), pp. 903-907. IEEE
Document Type: Working Paper
DOI: 10.1109/ISIT.2017.8006659
Description: We propose a direct estimation method for Rényi and f-divergence measures based on a new graph-theoretical interpretation. Suppose that we are given two sample sets $X$ and $Y$, with $N$ and $M$ samples respectively, where $\eta:=M/N$ is a fixed constant. Considering the $k$-nearest neighbor ($k$-NN) graph of $Y$ in the joint data set $(X,Y)$, we show that the average powered ratio of the number of $X$ points to the number of $Y$ points among the $k$-NN points is proportional to the Rényi divergence between the $X$ and $Y$ densities. A similar method can also be used to estimate f-divergence measures. We derive bias and variance rates, and show that for the class of $\gamma$-Hölder smooth functions, the estimator achieves the MSE rate of $O(N^{-2\gamma/(\gamma+d)})$. Furthermore, by using a weighted ensemble estimation technique, for density functions with continuous and bounded derivatives up to order $d$, together with additional conditions at the boundary of the support set, we derive an ensemble estimator that achieves the parametric MSE rate of $O(1/N)$. Our estimators are more computationally tractable than competing estimators, which makes them appealing in many practical applications.
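The core idea in the abstract can be sketched in a few lines: pool the two samples, and for each $Y$ point count how many of its $k$ nearest neighbors in the pooled set came from $X$ versus $Y$. A minimal illustrative sketch follows; the function name, the `+1` denominator regularization, the floor inside the logarithm, and the brute-force neighbor search are assumptions for the sake of a self-contained example, and the published estimator additionally involves bias-correction and clipping steps not reproduced here.

```python
import numpy as np

def renyi_divergence_knn(X, Y, alpha=2.0, k=10):
    """Illustrative k-NN ratio-based Renyi divergence estimate (a sketch,
    not the paper's exact estimator).

    For each point in Y, find its k nearest neighbors in the pooled set
    (X, Y), count how many come from X versus Y, and average the powered
    ratio scaled by eta = M/N.
    """
    N, M = len(X), len(Y)
    eta = M / N
    Z = np.vstack([X, Y])               # pooled sample; first N rows are X
    ratios = np.empty(M)
    for i, y in enumerate(Y):
        d = np.linalg.norm(Z - y, axis=1)
        d[N + i] = np.inf               # exclude the query point itself
        nn = np.argsort(d)[:k]          # indices of the k nearest neighbors
        n_x = np.sum(nn < N)            # neighbors drawn from X
        n_y = k - n_x                   # neighbors drawn from Y
        # +1 in the denominator avoids division by zero (an assumption here)
        ratios[i] = (eta * n_x / (n_y + 1)) ** alpha
    # order-alpha Renyi divergence from the mean powered ratio;
    # the small floor guards against log(0) when no X neighbors appear
    return np.log(max(ratios.mean(), 1e-12)) / (alpha - 1)
```

Because the neighbor search above is brute force ($O(NM)$ distance computations), a practical implementation would use a k-d tree or similar index; the computational appeal claimed in the abstract comes from needing only a single k-NN graph rather than separate density estimates.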
Comment: 2017 IEEE International Symposium on Information Theory (ISIT)
Database: arXiv