Combining a global SVM and local nearest-neighbor classifiers driven by local discriminative boundaries
Author: | Sim Heng Ong, Wei Xiong, Kelvin Weng Chiong Foong, Thi-Hoang-Diem Le, Joo-Hwee Lim, Jiang Liu |
---|---|
Year of publication: | 2009 |
Subject: |
Artificial neural network, Pattern recognition, Machine learning, k-nearest neighbors algorithm, Support vector machine, Random subspace method, Statistical classification, Kernel method, Discriminative model, Artificial intelligence, Curse of dimensionality, Mathematics |
Source: | 2009 4th IEEE Conference on Industrial Electronics and Applications. |
DOI: | 10.1109/iciea.2009.5138876 |
Description: | Nonlinear support vector machines (SVMs) rely on the kernel trick and tradeoff parameters to build nonlinear models for complex classification problems and to balance misclassification against generalization. The inconvenience of determining the kernel and the parameters has motivated the use of local nearest neighbor (NN) classifiers in lieu of global classifiers. This substitution, however, forgoes the advantage of the SVM in global error minimization. On the other hand, the NN rule assumes that class conditional probabilities are locally constant; this assumption does not hold near class boundaries, nor in high-dimensional spaces, due to the curse of dimensionality. We propose a hybrid classification method combining a global SVM with local NN classifiers. The local classifiers are invoked only when the global SVM is likely to fail. Furthermore, the local NN classifiers adopt an adaptive metric driven by local SVM discriminative boundaries. Improved performance has been demonstrated compared to partially similar methods. |
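The core idea in the abstract can be illustrated with a minimal sketch: trust a global decision function where it is confident, and fall back to a local k-NN vote near the decision boundary. This is not the authors' implementation; `hybrid_predict`, the linear surrogate `f`, and the `margin` threshold are all hypothetical stand-ins for the trained global SVM and its confidence criterion.

```python
import math

def hybrid_predict(x, train, f, margin=0.5, k=3):
    """Hybrid rule (sketch): use the global classifier f when it is
    confident (|f(x)| > margin); otherwise fall back to a local k-NN
    majority vote among the nearest training points."""
    score = f(x)
    if abs(score) > margin:
        return 1 if score > 0 else -1
    # Local fallback: sort training points by Euclidean distance to x
    # and take the sign of the k nearest labels.
    nearest = sorted(train, key=lambda p: math.dist(x, p[0]))
    votes = sum(label for _, label in nearest[:k])
    return 1 if votes > 0 else -1

# Toy example: a linear "global" decision surface on 2-D points
# (hypothetical surrogate for the trained SVM decision function).
f = lambda x: x[0] + x[1] - 1.0
train = [((0.0, 0.0), -1), ((0.2, 0.3), -1),
         ((1.0, 1.0), 1), ((0.9, 1.2), 1)]
print(hybrid_predict((2.0, 2.0), train, f))   # far from boundary -> global: 1
print(hybrid_predict((0.4, 0.5), train, f))   # near boundary -> local k-NN: -1
```

The paper goes further by also adapting the local metric to the SVM's discriminative boundary, which this fixed Euclidean-distance sketch omits.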
Database: | OpenAIRE |
External link: |