Prediction with the SVM Using Test Point Margins

Authors: Sureyya Ozogur-Akyuz, John Shawe-Taylor, Zakria Hussain
Year of publication: 2009
Subject: Data Mining
Source: Annals of Information Systems, ISBN: 9781441912794
DOI: 10.1007/978-1-4419-1280-0_7
Description: Support vector machines (SVMs) carry out binary classification by constructing a maximal-margin hyperplane between the two classes of observed (training) examples and then classifying test points according to the half-spaces in which they reside, irrespective of the distances between the test examples and the hyperplane. Cross-validation seeks the single SVM model, together with its optimal parameters, that minimizes the training error and generalizes well to future data. In contrast, in this chapter we collect all of the models found in the model selection phase and make predictions according to the model whose hyperplane achieves the maximum separation from a test point. This corresponds directly to using the L∞ norm to choose among SVM models at the testing stage. Furthermore, we also investigate other, more general techniques corresponding to different Lp norms and show how these methods allow us to avoid the complex and time-consuming paradigm of cross-validation. Experimental results demonstrate this advantage, showing significant decreases in computational time as well as competitive generalization error.
Database: OpenAIRE