Intrinsic Grassmann Averages for Online Linear, Robust and Nonlinear Subspace Learning

Authors: Baba C. Vemuri, Søren Hauberg, Rudrasis Chakraborty, Liu Yang
Year of publication: 2021
Source: IEEE Transactions on Pattern Analysis and Machine Intelligence. 43:3904-3917
ISSN: 1939-3539, 0162-8828
DOI: 10.1109/tpami.2020.2992392
Description: Principal component analysis (PCA) and kernel principal component analysis (KPCA) are fundamental methods in machine learning for dimensionality reduction. The former finds a low-dimensional linear approximation of the data in finite dimensions, while the latter operates in a typically infinite-dimensional reproducing kernel Hilbert space (RKHS). In this paper, we present a geometric framework for computing the principal linear subspaces in both (finite and infinite) situations as well as in the robust PCA case, which amounts to computing the intrinsic average on the space of all subspaces: the Grassmann manifold. Points on this manifold are defined as the subspaces spanned by $K$-tuples of observations. The intrinsic Grassmann average of these subspaces is shown to coincide with the principal components of the observations when they are drawn from a Gaussian distribution. We show similar results in the RKHS case and provide an efficient algorithm for computing the projection onto this average subspace. The result is a method akin to KPCA that is substantially faster. Further, we present a novel online version of KPCA using our geometric framework. Competitive performance of all our algorithms is demonstrated on a variety of real and synthetic data sets.
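The core idea in the abstract, that an average of subspaces spanned by observations recovers the principal components for Gaussian data, can be illustrated in the simplest linear case ($K = 1$). The sketch below is a simplified sign-alignment iteration in the spirit of Grassmann averages, not the authors' exact algorithm; the function name `grassmann_average_1d` and all parameters are illustrative assumptions, and only `numpy` is assumed.

```python
import numpy as np

def grassmann_average_1d(X, n_iter=20, seed=0):
    """Approximate the average of the 1-D subspaces spanned by the rows of X.

    X : (n, d) array of zero-mean observations.
    Returns a unit vector; for Gaussian data this direction should align
    with the first principal component (a sketch, not the paper's method).
    """
    rng = np.random.default_rng(seed)
    q = rng.standard_normal(X.shape[1])
    q /= np.linalg.norm(q)
    for _ in range(n_iter):
        # A subspace span(x) has no preferred sign, so flip each observation
        # to agree with the current estimate before averaging.
        signs = np.sign(X @ q)
        signs[signs == 0] = 1.0
        q_new = (signs[:, None] * X).sum(axis=0)
        q_new /= np.linalg.norm(q_new)
        q = q_new
    return q

# Illustrative check: anisotropic Gaussian data.
rng = np.random.default_rng(1)
X = rng.standard_normal((2000, 5)) @ np.diag([5.0, 2.0, 1.0, 0.5, 0.1])
X -= X.mean(axis=0)
q = grassmann_average_1d(X)
_, _, Vt = np.linalg.svd(X, full_matrices=False)
alignment = abs(q @ Vt[0])  # close to 1 when q matches the top PC
```

The sign-flipping step reflects the geometry described in the abstract: a point on the Grassmann manifold is a subspace, so each observation contributes a direction only up to sign, and the iteration averages those aligned directions.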
Database: OpenAIRE