Manifold-preserving graph reduction for sparse semi-supervised learning
Author: Shiliang Sun, Zakria Hussain, John Shawe-Taylor
Year of publication: 2014
Subject: Dense graph; Theoretical computer science; Computer science; Cognitive Neuroscience; Semi-supervised learning; Degeneracy (graph theory); Regularization (mathematics); Artificial Intelligence; Graph power; Rademacher complexity; Complement graph; Voltage graph; Nonlinear dimensionality reduction; Sparse approximation; 1-planar graph; Graph; Manifold; Computer Science Applications; Vertex (geometry); Support vector machine; ComputingMethodologies_PATTERNRECOGNITION; Graph bandwidth; Outlier; Graph (abstract data type); Level structure; Null graph; Laplace operator
Source: Neurocomputing 124: 13-21
ISSN: 0925-2312
DOI: 10.1016/j.neucom.2012.08.070
Description: Representing manifolds with fewer examples has two advantages: it eliminates the influence of outliers and noisy points, and it accelerates the evaluation of predictors learned from the manifolds. In this paper, we define manifold-preserving sparse graphs as a representation of sparsified manifolds and present a simple and efficient manifold-preserving graph reduction algorithm. To characterize the manifold-preserving properties, we derive a bound on the expected connectivity between a randomly picked point outside a sparse graph and its closest vertex in the sparse graph, and we also bound the approximation ratio of the proposed graph reduction algorithm. Moreover, we apply manifold-preserving sparse graphs to semi-supervised learning and propose sparse Laplacian support vector machines (SVMs). After characterizing the empirical Rademacher complexity of the function class induced by sparse Laplacian SVMs, which is closely related to their generalization error, we report experimental results on multiple data sets that indicate their feasibility for classification. (An illustrative sketch of the graph-reduction idea follows this record.)
Database: OpenAIRE
External link:
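The description above mentions a simple and efficient manifold-preserving graph reduction algorithm but, being an abstract, gives no implementation details. The sketch below is a hedged illustration of one plausible greedy reduction of this kind: it assumes a symmetric heat-kernel weight matrix on a k-nearest-neighbour graph and repeatedly keeps the candidate vertex with the largest total edge weight to the remaining candidates. The helper names (knn_weight_matrix, greedy_graph_reduction), the k-NN/heat-kernel construction, and the degree-based selection rule are illustrative assumptions, not the paper's exact procedure.

```python
# Illustrative sketch only: a greedy, degree-based reduction of a weighted
# k-NN graph. The selection rule (keep the candidate vertex with the largest
# total edge weight to the remaining candidates) is an assumption made for
# illustration, not necessarily the criterion used in the paper.
import numpy as np
from sklearn.neighbors import kneighbors_graph


def knn_weight_matrix(X, k=10, sigma=1.0):
    """Symmetric heat-kernel weight matrix on a k-NN graph (assumed setup)."""
    A = kneighbors_graph(X, n_neighbors=k, mode="distance").toarray()
    A = np.maximum(A, A.T)  # symmetrize the k-NN graph
    return np.where(A > 0, np.exp(-A**2 / (2 * sigma**2)), 0.0)


def greedy_graph_reduction(W, m):
    """Greedily select m vertices by connectivity to the not-yet-selected set."""
    n = W.shape[0]
    candidates = set(range(n))
    selected = []
    for _ in range(m):
        # degree of each candidate, restricted to the remaining candidates
        idx = np.fromiter(candidates, dtype=int)
        degrees = W[np.ix_(idx, idx)].sum(axis=1)
        best = idx[int(np.argmax(degrees))]
        selected.append(int(best))
        candidates.remove(int(best))
    selected = np.array(sorted(selected))
    # return the kept vertex indices and the sparse graph induced on them
    return selected, W[np.ix_(selected, selected)]


# Usage sketch: keep 40 of 200 vertices of a random point cloud.
X = np.random.RandomState(0).randn(200, 2)
W = knn_weight_matrix(X, k=10, sigma=0.5)
kept, W_sparse = greedy_graph_reduction(W, m=40)
```

Keeping high-connectivity vertices tends to retain points in dense regions of the manifold and to drop isolated outliers, which matches the motivation stated in the description. In a sparse Laplacian SVM, the graph Laplacian regularizer would presumably be built from the reduced graph W_sparse rather than from the full graph, though that wiring is not spelled out in this record.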