Minimum Variance Estimation of a Sparse Vector within the Linear Gaussian Model: An RKHS Approach

Authors: Alexander Jung, Yonina C. Eldar, Zvika Ben-Haim, Sebastian Schmutzhard, Franz Hlawatsch
Language: English
Year of publication: 2013
Subject:
Description: We consider minimum variance estimation within the sparse linear Gaussian model (SLGM). A sparse vector is to be estimated from a linearly transformed version embedded in Gaussian noise. Our analysis is based on the theory of reproducing kernel Hilbert spaces (RKHS). After characterizing the RKHS associated with the SLGM, we derive novel lower bounds on the minimum variance achievable by estimators with a prescribed bias function; this includes the important case of unbiased estimation. The variance bounds are obtained via an orthogonal projection of the prescribed mean function onto a subspace of the RKHS associated with the SLGM. Furthermore, we specialize our bounds to compressed sensing measurement matrices and express them in terms of the restricted isometry and coherence parameters. For the special case of the SLGM given by the sparse signal in noise model (SSNM), we derive closed-form expressions for the minimum achievable variance (Barankin bound) and the corresponding locally minimum variance estimator. We also analyze the effects of exact and approximate sparsity information and show that the minimum achievable variance for exact sparsity is not a limiting case of that for approximate sparsity. Finally, we compare our bounds with the variance of three well-known estimators: the maximum-likelihood estimator, the hard-thresholding estimator, and compressive reconstruction using orthogonal matching pursuit.
Database: OpenAIRE
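
The following is a minimal simulation sketch of the sparse signal in noise model (SSNM), the special case of the SLGM in which the measurement matrix is the identity, together with one common variant of a hard-thresholding estimator (keeping the s largest-magnitude entries). All dimensions, the sparsity level, the noise variance, and the helper name hard_threshold are illustrative assumptions, not quantities taken from the paper.

import numpy as np

rng = np.random.default_rng(0)

n_dim, sparsity, noise_var = 50, 3, 0.1  # illustrative values only

# Generate an s-sparse parameter vector x (at most `sparsity` nonzero entries).
x = np.zeros(n_dim)
support = rng.choice(n_dim, size=sparsity, replace=False)
x[support] = rng.normal(size=sparsity)

# SSNM observation: y = x + n with n ~ N(0, noise_var * I).
y = x + rng.normal(scale=np.sqrt(noise_var), size=n_dim)

def hard_threshold(y, s):
    # Keep the s largest-magnitude entries of y and zero out the rest.
    x_hat = np.zeros_like(y)
    keep = np.argsort(np.abs(y))[-s:]
    x_hat[keep] = y[keep]
    return x_hat

x_hat = hard_threshold(y, sparsity)
print("squared error:", np.sum((x_hat - x) ** 2))

Repeating this over many noise realizations at a fixed x yields empirical error statistics (and, after subtracting the bias, the estimator's variance) of the kind the paper compares against its RKHS-based lower bounds.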