PrecoG: an efficient unitary split preconditioner for the transform-domain LMS filter via graph Laplacian regularization

Authors: Batabyal, Tamal; Weller, Daniel S.; Kapur, Jaideep; Acton, Scott T.
Publication year: 2018
Subject:
Document type: Working Paper
Description: Transform-domain least mean squares (LMS) adaptive filters encompass the class of algorithms in which the input data undergo a data-independent unitary transform followed by a power normalization stage as preprocessing steps. Because conventional transformations are not data-dependent, this preconditioning procedure has been shown theoretically to improve the convergence of the LMS filter only for certain classes of input data. In practice, however, when the class of input data is not known beforehand, it is difficult to decide which transformation to use. There is therefore a need for a learning framework that obtains such a preconditioning transformation from the input data before it is applied. It is hypothesized that the underlying topology of the data affects the selection of the transformation. With the input modeled as a weighted graph that mimics neuronal interactions, PrecoG obtains the desired transform by recursive estimation of the graph Laplacian matrix. Additionally, we show the efficacy of the transform as a generalized split preconditioner on a linear system of equations and in Hebb-LMS settings. In terms of the improvement of the condition number after applying the transformation, PrecoG performs significantly better than existing state-of-the-art techniques that involve unitary and non-unitary transforms.
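To make the preprocessing pipeline described above concrete, here is a minimal sketch of a conventional transform-domain LMS filter using a fixed (data-independent) orthonormal DCT-II as the unitary transform, followed by per-coefficient power normalization. This is the generic baseline the abstract contrasts against, not the PrecoG transform itself; the function names, the smoothing factor of 0.9, and the step size are illustrative assumptions.

```python
import numpy as np

def dct_matrix(n):
    # Orthonormal DCT-II matrix: a standard data-independent unitary transform.
    k = np.arange(n)
    T = np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
    T[0] *= 1.0 / np.sqrt(2.0)
    return T * np.sqrt(2.0 / n)

def tdlms(x, d, n_taps=8, mu=0.05, eps=1e-8):
    # Transform-domain LMS: each input tap vector is rotated by a fixed
    # unitary transform, and each transformed coefficient's step size is
    # normalized by a running estimate of that coefficient's power.
    T = dct_matrix(n_taps)
    w = np.zeros(n_taps)           # adaptive weights in the transform domain
    p = np.full(n_taps, eps)       # per-coefficient power estimates
    y = np.zeros(len(x))
    for i in range(n_taps - 1, len(x)):
        u = T @ x[i - n_taps + 1:i + 1][::-1]  # transformed tap vector
        p = 0.9 * p + 0.1 * u**2               # power normalization stage
        y[i] = w @ u
        e = d[i] - y[i]
        w += mu * e * u / (p + eps)            # power-normalized LMS update
    return w, y
```

For a white input the transform offers little; its benefit appears when the input correlation structure is poorly matched to the fixed transform, which is precisely the situation that motivates learning the transform from data.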
Comment: Preprint, in submission
Database: arXiv
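The abstract's condition-number criterion for a split preconditioner can be illustrated with a generic baseline: symmetric (split) Jacobi scaling of a symmetric positive definite system, A_pre = M^{-1} A M^{-T} with M = diag(sqrt(diag(A))). This is a standard textbook preconditioner used here only to show the metric; it is not the PrecoG transform, and the test matrix construction is an assumption for illustration.

```python
import numpy as np

def split_jacobi_precondition(A):
    # Split (two-sided) Jacobi preconditioning of an SPD matrix:
    # A_pre = D^{-1/2} A D^{-1/2}, where D = diag(A). A_pre has unit diagonal.
    m_inv = 1.0 / np.sqrt(np.diag(A))
    return m_inv[:, None] * A * m_inv[None, :]

rng = np.random.default_rng(1)
B = rng.standard_normal((50, 50))
S = B @ B.T + 50.0 * np.eye(50)          # moderately conditioned SPD core
d = 10.0 ** rng.uniform(-3, 3, 50)       # badly scaled diagonal weights
A = d[:, None] * S * d[None, :]          # ill-conditioned SPD system matrix

A_pre = split_jacobi_precondition(A)
print(np.linalg.cond(A), np.linalg.cond(A_pre))
```

Comparing the two printed condition numbers reproduces, for this baseline, the evaluation criterion the abstract uses to rank preconditioning transforms.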