Regularized Recurrent Least Squares Support Vector Machines
Author: | Yacine Oussar, Weisheng Xu, Haini Qu, Gérard Dreyfus |
Year of publication: | 2009 |
Subject: |
Support vector machine, Least squares support vector machine, Regularization (mathematics), Regularization perspectives on support vector machines, Relevance vector machine, Margin classifier, Artificial neural network, Machine learning, Artificial intelligence, Open problem, Mathematics |
Source: | IJCBS |
DOI: | 10.1109/ijcbs.2009.58 |
Description: | Support vector machines are widely used for classification and regression tasks. They provide reliable static models, but their extension to the training of dynamic models is still an open problem. In the present paper, we describe Regularized Recurrent Support Vector Machines which, in contrast to previous Recurrent Support Vector Machine models, allow the design of dynamical models while retaining the built-in regularization mechanism present in Support Vector Machines. The principle is validated on academic examples; the results are shown to compare favorably to those obtained by unregularized Recurrent Support Vector Machines and by regularized, partially recurrent Support Vector Machines (see the sketch after this record). |
Database: | OpenAIRE |
External link: |
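For orientation only, and not taken from the record itself: the block below sketches the standard least-squares SVM regression primal together with an assumed output-feedback (NARX-style) regressor of the kind a recurrent variant would use. The symbols $\gamma$, $\varphi$, $u_k$, $\hat{y}_k$ and the lag orders $n$, $m$ are notational assumptions, not the authors' exact formulation.

```latex
% Standard (static) LS-SVM regression primal: the w^T w term is the built-in
% regularizer, and gamma trades it off against the squared training errors e_k.
\min_{w,\,b,\,e}\;\; \tfrac{1}{2}\, w^{\top} w \;+\; \tfrac{\gamma}{2} \sum_{k=1}^{N} e_k^{2}
\qquad \text{s.t.} \qquad y_k = w^{\top} \varphi(x_k) + b + e_k, \quad k = 1,\dots,N.

% Assumed recurrent (output-feedback, NARX-style) regressor: past model outputs
% \hat{y} are fed back as inputs together with the external inputs u, which is
% what makes recurrent training harder than the static least-squares problem.
\hat{y}_k = w^{\top} \varphi\big(\hat{y}_{k-1},\dots,\hat{y}_{k-n},\; u_{k-1},\dots,u_{k-m}\big) + b .
```

Under this reading, keeping the $w^{\top} w$ term in the recurrent setting is what the abstract calls retaining the built-in regularization mechanism, while feeding back $\hat{y}$ instead of measured $y$ is what takes the problem outside the static LS-SVM framework.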