Description: |
Singular value decomposition (SVD) and principal component analysis enjoy a broad range of applications, including rank estimation, noise reduction, classification, and compression. The resulting singular vectors form orthogonal basis sets for subspace projection techniques, and the procedures apply to general data matrices. Spectral matrices belong to a special class known as non-negative matrices. A key property of non-negative matrices is that their columns/rows form non-negative cones: any non-negative linear combination of the columns/rows belongs to the cone. This property has been used implicitly in the popular rank estimation techniques known as virtual dimensionality (VD) and hyperspectral signal identification by minimum error (HySime). Data sets of spectra reside in the non-negative orthant. The subspace spanned by an SVD of a set of spectra extends across all orthants; however, SVD projections can be constrained to the non-negative orthant. In this paper two types of singular vector projection constraints are identified: one that confines the projection to lie within the cone formed by the spectral data set, and a second that restricts projections only to the non-negative orthant. The former is referred to here as the inner constraint set, the latter as the outer constraint set. The outer constraint set forms a broader cone, since it includes projections outside the cone formed by the data array. The two cones form boundaries for the cones produced by non-negative matrix factorizations (NNF). Ambiguities in the NNF lead to a variety of possible sets of left and right non-negative vectors and their cones. The paper presents the constraint set approach and illustrates it with applications to spectral classification.
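The distinction between the two constraint sets can be sketched numerically. The following is an illustrative toy, assuming random non-negative data; the clipping step for the outer constraint and the projected-gradient non-negative least squares for the inner constraint are one possible realization, not the paper's procedure:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy non-negative "spectral" data matrix: columns are spectra.
A = rng.uniform(0.0, 1.0, size=(50, 8))

# Rank-r SVD subspace of the data.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
r = 3
Ur = U[:, :r]

x = rng.uniform(0.0, 1.0, size=50)  # a new spectrum to project

# Unconstrained SVD projection: may leave the non-negative orthant.
p_svd = Ur @ (Ur.T @ x)

# Outer constraint: restrict the projection to the non-negative orthant
# (here by simple clipping, one possible choice).
p_outer = np.clip(p_svd, 0.0, None)

# Inner constraint: restrict to the cone of the data, i.e. represent x as a
# non-negative combination of the columns of A (projected-gradient NNLS).
def nnls_pg(A, b, iters=2000):
    c = np.zeros(A.shape[1])
    step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1 / sigma_max^2
    for _ in range(iters):
        c = np.clip(c - step * (A.T @ (A @ c - b)), 0.0, None)
    return c

c = nnls_pg(A, x)
p_inner = A @ c

# Every point of the inner (data) cone lies in the non-negative orthant,
# so the inner cone is contained in the outer constraint set.
assert np.all(p_inner >= 0) and np.all(p_outer >= 0)
```

The inner projection is by construction a non-negative combination of the measured spectra, while the outer projection only enforces non-negativity of the result, which is why the outer cone is the broader of the two.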