Author: |
Huang, X.; Suykens, Johan; Wang, S.; Maier, A.; Hornegger, J. |
Language: |
English |
Year of publication: |
2018 |
Subject: |
|
Description: |
This brief proposes a truncated ℓ1 distance (TL1) kernel, which results in a classifier that is nonlinear over the global region but linear within each subregion. With this kernel, the subregion structure can be trained using all of the training data, and local linear classifiers can be established simultaneously. The TL1 kernel adapts well to nonlinearity and is suitable for problems that require different nonlinearities in different regions. Although the TL1 kernel is not positive semidefinite, some classical kernel learning methods remain applicable, which means the TL1 kernel can be used directly in standard toolboxes by replacing the kernel evaluation. In numerical experiments, the TL1 kernel with a pregiven parameter achieves performance similar to or better than that of the radial basis function kernel with its parameter tuned by cross-validation, indicating that the TL1 kernel is a promising nonlinear kernel for classification tasks. (A minimal illustrative sketch of such a drop-in kernel evaluation appears after this record.) |
Is part of: |
IEEE Transactions on Neural Networks and Learning Systems, vol. 29, issue 5, pages 2025-2030; location: United States; status: published |
Database: |
OpenAIRE |
External link: |
|
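The description notes that the TL1 kernel can be used in standard toolboxes by replacing the kernel evaluation. The following Python sketch illustrates that idea under stated assumptions: it assumes the TL1 kernel takes the truncated ℓ1 form K(x, y) = max(rho - ||x - y||_1, 0), and uses rho = 0.7 * d as an illustrative "pregiven" parameter; the dataset, parameter values, and function names are hypothetical and not taken from this record.

    import numpy as np
    from sklearn.datasets import make_moons
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC

    def tl1_kernel(X, Y, rho):
        # Assumed TL1 form: K(x, y) = max(rho - ||x - y||_1, 0).
        # Computes pairwise l1 distances between rows of X and rows of Y,
        # then truncates at rho.
        dists = np.abs(X[:, None, :] - Y[None, :, :]).sum(axis=2)
        return np.maximum(rho - dists, 0.0)

    X, y = make_moons(n_samples=400, noise=0.2, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    rho = 0.7 * X.shape[1]  # illustrative pregiven parameter (assumption)
    # A standard SVM accepts a callable kernel, so only the kernel
    # evaluation changes; the rest of the toolbox workflow stays the same.
    clf = SVC(kernel=lambda A, B: tl1_kernel(A, B, rho), C=1.0)
    clf.fit(X_tr, y_tr)
    print("test accuracy:", clf.score(X_te, y_te))

Because the resulting Gram matrix need not be positive semidefinite, the solver is not guaranteed to be convex, but as the description states, classical kernel learning methods can still be applied in this plug-in fashion.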