New method for solving Ivanov regularization-based support vector machine learning
Author: Xiang Xu, Daoli Zhu
Publication year: 2021
Subject: Computer Science::Machine Learning; General Computer Science; Generalization; Computer science; Management Science and Operations Research; Regularization (mathematics); Support vector machine; ComputingMethodologies_PATTERNRECOGNITION; Hyperplane; Modeling and Simulation; Statistical learning theory; Structural risk minimization; Minification; Limit (mathematics); Algorithm
Source: Computers & Operations Research. 136:105504
ISSN: 0305-0548
Description: The support vector machine (SVM) is one of the best-known machine learning models and is based on the structural risk minimization (SRM) principle. The SRM principle, formulated by Vapnik within the framework of statistical learning theory, can be naturally expressed as an Ivanov regularization-based SVM (I-SVM). Recent advances in learning theory show that the I-SVM allows more effective control of the learning hypothesis space and thus better generalization ability. In this paper, we propose a new method for optimizing the I-SVM to find the optimal separating hyperplane. The proposed approach provides a parallel block minimization framework for solving the dual I-SVM problem that exploits the advantages of the randomized primal-dual coordinate (RPDC) method, and the sub-optimization routine in each RPDC iteration has a simple closed-form solution. We also provide an upper bound τ* for the hypothesis-space control parameter τ by solving a Morozov regularization-based SVM (M-SVM) problem. Experimental results confirm the improved performance of our method on general I-SVM learning problems. (Hedged sketches of the Ivanov formulation follow this record.)
Database: OpenAIRE
External link:
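
For context, here is a minimal sketch of the formulations the abstract refers to, assuming the standard textbook definitions of Tikhonov- and Ivanov-style soft-margin SVMs; the abstract does not give the paper's exact notation, which may differ. Tikhonov regularization penalizes ‖w‖² in the objective, while Ivanov regularization minimizes the empirical hinge loss subject to an explicit bound τ on the hypothesis space.

```latex
% Hedged sketch: textbook Tikhonov- and Ivanov-style soft-margin SVMs;
% the paper's exact formulation may differ from this reconstruction.
\begin{align*}
\text{Tikhonov (classical SVM):}\quad
  &\min_{w,\,b,\,\xi}\ \tfrac{1}{2}\lVert w\rVert^{2} + C\sum_{i=1}^{n}\xi_{i}
  \ \ \text{s.t.}\ \ y_{i}(w^{\top}x_{i}+b)\ge 1-\xi_{i},\ \ \xi_{i}\ge 0;\\[4pt]
\text{Ivanov (I-SVM):}\quad
  &\min_{w,\,b,\,\xi}\ \sum_{i=1}^{n}\xi_{i}
  \ \ \text{s.t.}\ \ y_{i}(w^{\top}x_{i}+b)\ge 1-\xi_{i},\ \ \xi_{i}\ge 0,\ \ \lVert w\rVert^{2}\le\tau.
\end{align*}
```

Under this reading, the Morozov variant swaps the roles of loss and regularizer, minimizing ‖w‖² subject to a bound on the total slack; that is consistent with the abstract's use of an M-SVM solve to obtain the upper bound τ* for τ.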
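As a concrete illustration of the Ivanov constraint, the following hedged sketch solves a tiny I-SVM instance directly as a convex program with CVXPY. This is a generic off-the-shelf route, not the paper's parallel RPDC method; the synthetic data, the value of τ, and the solver choice are all illustrative assumptions.

```python
# Hedged sketch: solving a small I-SVM instance as a convex program
# with CVXPY. This is NOT the paper's parallel RPDC method; it only
# illustrates the Ivanov formulation with an explicit constraint on w.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
n, d = 60, 2
# Two Gaussian blobs as a toy binary classification problem.
X = np.vstack([rng.normal(+1.0, 1.0, (n // 2, d)),
               rng.normal(-1.0, 1.0, (n // 2, d))])
y = np.array([+1.0] * (n // 2) + [-1.0] * (n // 2))

tau = 5.0                      # hypothesis-space control parameter (assumed)
w = cp.Variable(d)
b = cp.Variable()
xi = cp.Variable(n)            # slack variables (hinge losses)

constraints = [cp.multiply(y, X @ w + b) >= 1 - xi,   # margin constraints
               xi >= 0,
               cp.sum_squares(w) <= tau]              # Ivanov: ||w||^2 <= tau
prob = cp.Problem(cp.Minimize(cp.sum(xi)), constraints)
prob.solve()
print("optimal total slack:", prob.value)
print("w =", w.value, " b =", b.value)
```

Treating ‖w‖² ≤ τ as an explicit constraint, rather than folding it into the objective with a trade-off weight C, is exactly what distinguishes the Ivanov formulation from the classical Tikhonov-style SVM.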