Description: |
In this article we present a new class of support vector machines for the binary classification task. Our support vector machines are constructed from only two support vectors and have a very low Vapnik-Chervonenkis dimension, so they generalize well. Geometrically, our approach is based on searching for a suitable pair of observations drawn from the two classes of the explained variable. Once this pair is found, the discriminant hyperplane is taken orthogonal to the line connecting these observations. This method deals well with data sets that have a large number of features and a small number of observations, such as gene expression data. We illustrate the performance of our classification method on gene expression data and show that it is superior to other classifiers, in particular to diagonal linear discriminant analysis and k-nearest neighbors, which achieved the lowest error rates in previous studies of tumor classification.
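The geometric rule described above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: it assumes the hyperplane passes through the midpoint of the two chosen observations, and the function and variable names (`two_point_classifier`, `x_pos`, `x_neg`) are hypothetical.

```python
import numpy as np

def two_point_classifier(x_pos, x_neg):
    """Build a linear classifier from one observation of each class.

    The separating hyperplane is orthogonal to the line connecting
    x_pos and x_neg; placing it at their midpoint is an assumption
    made for this sketch.
    """
    x_pos = np.asarray(x_pos, dtype=float)
    x_neg = np.asarray(x_neg, dtype=float)
    w = x_pos - x_neg                 # normal vector of the hyperplane
    midpoint = (x_pos + x_neg) / 2.0  # assumed location of the hyperplane
    bias = w @ midpoint

    def predict(x):
        # Sign of the signed distance decides the class label.
        return np.sign(np.asarray(x, dtype=float) @ w - bias)

    return predict

# Toy usage with two 2-dimensional observations:
predict = two_point_classifier([2.0, 2.0], [0.0, 0.0])
print(predict([3.0, 3.0]))    # point on the positive side -> 1.0
print(print_label := predict([-1.0, -1.0]))  # negative side -> -1.0
```

Because the classifier depends on only two observations, its complexity does not grow with the number of features, which is consistent with the abstract's claim about high-dimensional, small-sample data.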