Information Measures, Inequalities and Performance Bounds for Parameter Estimation in Impulsive Noise Environments
Authors: Jihad Fahs, Ibrahim Abou-Faycal
Year of publication: 2018
Subject: FOS: Computer and information sciences; Information Theory (cs.IT); Computer Science - Information Theory; noise measurement; mean squared error; estimation theory; Gaussian; estimator; differential entropy; channel capacity; Fisher information; applied mathematics; statistics & probability; networking & telecommunications; electrical engineering, electronic engineering, information engineering; Library and Information Sciences; Computer Science Applications; Information Systems; Mathematics
Source: IEEE Transactions on Information Theory, 64:1825-1844
ISSN: 1557-9654, 0018-9448
DOI: 10.1109/tit.2017.2785379
Description: Recent studies found that many channels are affected by additive noise that is impulsive in nature and is best modeled by heavy-tailed symmetric alpha-stable distributions. Dealing with impulsive noise environments brings an added complexity relative to the standard Gaussian environment: non-Gaussian alpha-stable probability density functions have an infinite second moment, and the "nice" Hilbert space structure of the space of random variables with a finite second moment is lost, along with its tools and methodologies. This is indeed the case in estimation theory, where the classical tools for quantifying the performance of an estimator are tightly tied to the assumption of finite-variance variables. In alpha-stable environments, quantities such as the mean square error and the Cramér-Rao bound are hence problematic. In this work, we tackle the parameter estimation problem in impulsive noise environments and develop novel tools tailored to alpha-stable and heavy-tailed noise environments, tools that coincide with the standard ones adopted in the Gaussian setup, namely a generalized "power" measure and a generalized Fisher information. We generalize known information inequalities commonly used in the Gaussian context: de Bruijn's identity, the data processing inequality, the Fisher information inequality, the isoperimetric inequality for entropies, and the Cramér-Rao bound (the classical Gaussian-setting forms of these statements are recalled after this record for reference). Additionally, we derive upper bounds on the differential entropy of independent sums having a stable component. Finally, the new "power" measure is used to shed some light on the capacity of the additive alpha-stable noise channel in a setup that generalizes the linear average-power-constrained AWGN channel. Our theoretical findings are paralleled with numerical evaluations of various quantities and bounds using MATLAB packages developed for this purpose; an illustrative sampling sketch also follows this record.
Comment: 42 pages, 5 figures; submitted to the IEEE Transactions on Information Theory for peer review.
Database: OpenAIRE
External link:
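
For orientation, the following is a minimal recap of the classical Gaussian-setting identities named in the description. These are the standard textbook statements that the paper generalizes to the alpha-stable setting, not the paper's generalized forms. Notation: h is differential entropy, J the Fisher information of a density, N(X) = e^{2h(X)}/(2πe) the entropy power, and I(θ) the parametric Fisher information.

```latex
% Standard Gaussian-setting statements that the paper generalizes.

% de Bruijn's identity: X independent of Z ~ N(0, 1),
\[
  \frac{d}{dt}\, h\bigl(X + \sqrt{t}\, Z\bigr) \;=\; \frac{1}{2}\, J\bigl(X + \sqrt{t}\, Z\bigr).
\]

% Fisher information inequality (Stam), X and Y independent:
\[
  \frac{1}{J(X + Y)} \;\ge\; \frac{1}{J(X)} + \frac{1}{J(Y)}.
\]

% Isoperimetric inequality for entropies, with entropy power
% N(X) = e^{2 h(X)} / (2 \pi e):
\[
  N(X)\, J(X) \;\ge\; 1.
\]

% Cramér-Rao bound, unbiased estimator \hat{\theta} of \theta:
\[
  \mathrm{Var}\bigl(\hat{\theta}\bigr) \;\ge\; \frac{1}{I(\theta)}.
\]
```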
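
As a numerical illustration of why the mean square error is problematic here: for alpha strictly below 2, the second moment of an alpha-stable law is infinite, so the empirical second moment never settles as the sample size grows. Below is a minimal Python sketch using scipy.stats.levy_stable; the authors' own numerical packages are in MATLAB, so this script, including the choice alpha = 1.5, is an illustrative assumption rather than their code.

```python
import numpy as np
from scipy.stats import levy_stable

# Symmetric alpha-stable noise: alpha < 2 implies an infinite second moment,
# which is the source of the difficulties described in the abstract.
alpha = 1.5   # illustrative choice; any alpha in (0, 2) is heavy-tailed
beta = 0.0    # beta = 0 gives the symmetric case

rng = np.random.default_rng(seed=0)
samples = levy_stable.rvs(alpha, beta, size=200_000, random_state=rng)

# Running empirical second moment: for alpha < 2 it is dominated by rare,
# huge samples and keeps jumping instead of converging, so MSE-based
# performance criteria lose their meaning.
for n in (10**3, 10**4, 10**5, 2 * 10**5):
    print(f"n = {n:>6}: empirical E[X^2] = {np.mean(samples[:n]**2):.1f}")

# Sanity check: alpha = 2 is the Gaussian endpoint with finite variance,
# and the same statistic converges, consistent with the requirement that
# the generalized tools coincide with the standard ones in the Gaussian setup.
gauss = levy_stable.rvs(2.0, 0.0, size=200_000, random_state=rng)
print(f"alpha = 2 (Gaussian): empirical E[X^2] = {np.mean(gauss**2):.2f}")
```

With alpha below 2, the printed values typically vary wildly across sample sizes, while the alpha = 2 line stabilizes; this contrast is exactly what motivates the paper's generalized "power" measure.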