Effects of Mismatched Training on Adaptive Detection

Author: R. S. Raghavan
Publication year: 2018
Source: ACSSC
DOI: 10.1109/acssc.2018.8645092
Description: Interference cancellation in the adaptive radar detection context typically relies on training samples to estimate the covariance matrix of the interference and noise in the test vector. Adaptive detection algorithms are generally developed under the assumption that the interference-plus-noise covariance matrix of the test vector (say C) is the same as the interference-plus-noise covariance matrix of the training vectors (say Σ). When the two covariance matrices are not perfectly matched, the constant false alarm rate (CFAR) feature of adaptive detectors no longer holds. For mismatched conditions, standard scalar CFAR techniques can be applied to adaptive detector outputs to regain the CFAR feature. In this paper we consider the CFAR detector based on the Adaptive Matched Filter (AMF) statistic and show that the effects of covariance matrix mismatch can be condensed into a single scalar quantity, referred to as the loss factor ρ. The loss factor is a random variable when the estimate of Σ is a random matrix. Sample results are provided for the deterministic case.
Database: OpenAIRE
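
The abstract does not give the paper's exact definition of ρ. As a minimal sketch, assuming ρ is the standard generalized SINR loss of the filter built from Σ when the true test-cell covariance is C (the deterministic case mentioned above), the following Python computes the AMF statistic and this loss factor. The steering vector, the exponential correlation models, and the function names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def amf_statistic(x, s, Sigma_hat):
    """AMF test statistic t = |s^H Sigma_hat^{-1} x|^2 / (s^H Sigma_hat^{-1} s),
    where x is the test vector, s the steering vector, and Sigma_hat the
    covariance estimate formed from training samples."""
    Si = np.linalg.inv(Sigma_hat)
    num = np.abs(np.conj(s) @ Si @ x) ** 2
    den = np.real(np.conj(s) @ Si @ s)
    return num / den

def sinr_loss(s, Sigma, C):
    """Generalized SINR loss of the filter w = Sigma^{-1} s when the test
    vector's true covariance is C rather than Sigma (deterministic mismatch):
        rho = |s^H Sigma^{-1} s|^2
              / ((s^H Sigma^{-1} C Sigma^{-1} s) (s^H C^{-1} s)).
    By the Cauchy-Schwarz inequality rho <= 1, with equality when
    Sigma^{-1} s and C^{-1} s are proportional (e.g., C = alpha * Sigma)."""
    Si = np.linalg.inv(Sigma)
    num = np.abs(np.conj(s) @ Si @ s) ** 2
    den = (np.real(np.conj(s) @ Si @ C @ Si @ s)
           * np.real(np.conj(s) @ np.linalg.inv(C) @ s))
    return num / den

def exp_corr(n, r):
    """Exponentially correlated covariance [r^|i-j|], a common clutter model."""
    i = np.arange(n)
    return (r ** np.abs(i[:, None] - i[None, :])).astype(complex)

n = 8
s = np.exp(1j * 0.4 * np.pi * np.arange(n))   # hypothetical steering vector
Sigma = exp_corr(n, 0.9)                      # training-data covariance
C = exp_corr(n, 0.7)                          # mismatched test-cell covariance
print(f"rho = {sinr_loss(s, Sigma, C):.4f}")  # < 1; equals 1 when C = Sigma
```

In this sketch, scale factors on C cancel between the two terms in the denominator, so ρ reflects only the shape mismatch between C and Σ, not an overall power difference; whether the paper's ρ shares this property cannot be determined from the abstract alone.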