Eigenvectors of Orthogonally Decomposable Functions
Authors: | James Voss, Mikhail Belkin, Luis Rademacher |
Year of publication: | 2018 |
Subject: | FOS: Computer and information sciences; Pure mathematics; General Computer Science; General Mathematics; Numerical analysis; Structure (category theory); Spectral theorem; Machine Learning (cs.LG); Generalized eigenvector; Symmetric matrix; Eigendecomposition of a matrix; Eigenvalues and eigenvectors; Mathematics |
Source: | SIAM Journal on Computing. 47:547-615 |
ISSN: | 0097-5397, 1095-7111 |
DOI: | 10.1137/17m1122013 |
Description: | The eigendecomposition of quadratic forms (symmetric matrices), guaranteed by the spectral theorem, is a foundational result in applied mathematics. Motivated by a structure shared by several inferential problems of recent interest---orthogonal tensor decompositions, Independent Component Analysis (ICA), topic models, spectral clustering, and Gaussian mixture learning---we generalize the eigendecomposition from quadratic forms to a broad class of "orthogonally decomposable" functions. We identify a key role of convexity in our extension and generalize two traditional characterizations of eigenvectors: first, the eigenvectors of a quadratic form arise from the optima of the quadratic form on the sphere; second, the eigenvectors are the fixed points of the power iteration. In our setting, we consider a simple first-order generalization of the power method, which we call the gradient iteration. It leads to efficient and easily implementable methods for basis recovery, and it includes influential machine learning methods such as cumulant-based FastICA and the tensor power iteration for orthogonally decomposable tensors as special cases. We provide a complete theoretical analysis of the gradient iteration using the structure theory of discrete dynamical systems, establishing almost sure convergence and fast (super-linear) convergence rates. The analysis also extends to the case when the observed function is only approximately orthogonally decomposable, with bounds that are polynomial in the dimension and other relevant parameters, such as the perturbation size. Our perturbation results can be viewed as a nonlinear analogue of the classical Davis-Kahan theorem for perturbations of eigenvectors of symmetric matrices. 69 pages |
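The gradient iteration described in the abstract repeatedly applies the normalized gradient map x ← ∇f(x)/‖∇f(x)‖. A minimal sketch (not the authors' code): assume the orthogonally decomposable function f(x) = Σᵢ ⟨x, vᵢ⟩⁴ with {vᵢ} taken as the standard basis, so the gradient is the elementwise cube 4x³ and the iteration converges rapidly to a basis vector, which is a fixed point up to sign.

```python
import numpy as np

def gradient_iteration(grad_f, x0, iters=20):
    """Apply the normalized gradient map x <- grad_f(x) / ||grad_f(x)||."""
    x = x0 / np.linalg.norm(x0)
    for _ in range(iters):
        g = grad_f(x)
        x = g / np.linalg.norm(g)
    return x

# Assumed example: f(x) = sum_i <x, e_i>^4 in the standard basis,
# so grad f(x) = 4 * x**3 (elementwise cube).
grad_f = lambda x: 4 * x**3

x0 = np.array([0.9, 0.5, 0.3, 0.1])
x_star = gradient_iteration(grad_f, x0)
# x_star is (up to sign) a standard basis vector
```

When f is the quadratic form x ↦ xᵀAx/2, the gradient map is x ↦ Ax and the scheme reduces to the classical power iteration; for the quartic contrast above, the cubing drives the largest coordinate to dominate at a super-linear rate.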
Database: | OpenAIRE |
External link: |