An experimental study of the extended NRBF regression model and its enhancement for classification problem
Authors: Geok See Ng, Abdul Wahab, Sevki S. Erdogan, Lin Ma
Year: 2008
Keywords: Normalization (statistics), Bayesian probability, Regression analysis, Machine learning, Mixture model, Function approximation, Expectation–maximization algorithm, Radial basis function, Artificial intelligence, Classifier, Cognitive Neuroscience, Computer Science Applications
Source: Neurocomputing 72 (2008) 458–470
ISSN: 0925-2312
DOI: 10.1016/j.neucom.2007.12.011
Abstract: As an extension of the traditional normalized radial basis function (NRBF) model, the extended normalized RBF (ENRBF) model was proposed by Xu [RBF nets, mixture experts, and Bayesian Ying-Yang learning, Neurocomputing 19 (1998) 223–257]. In this paper, we present a supplementary study of ENRBF through several carefully designed experiments and further theoretical discussion. It is shown that ENRBF can efficiently improve learning accuracy under certain circumstances. Moreover, since the ENRBF model was originally proposed for regression and function-approximation problems, this work takes a further step and modifies the ENRBF model to handle classification problems. Both the original ENRBF model and the newly proposed ENRBF classifier (ENRBFC) can be viewed as special cases of the mixture-of-experts (ME) model discussed in Xu et al. [An alternative model for mixtures of experts, in: Advances in Neural Information Processing Systems, MIT Press, Cambridge, MA, 1995]. Experimental results show the potential of ENRBFC compared with several related classifiers.
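To make the distinction between the two models concrete, the sketch below illustrates (under assumed Gaussian basis functions and notation chosen here, not taken from the paper) how NRBF computes a normalized weighted sum of constant weights, while ENRBF replaces each constant weight with a per-center linear function of the input:

```python
import numpy as np

def nrbf_predict(X, centers, widths, weights):
    """NRBF regression: f(x) = sum_j w_j * phi_j(x) / sum_k phi_k(x),
    with Gaussian basis functions phi_j (an illustrative choice)."""
    # squared distances between inputs and centers, shape (n_samples, n_centers)
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    phi = np.exp(-d2 / (2.0 * widths ** 2))          # Gaussian activations
    phi_norm = phi / phi.sum(axis=1, keepdims=True)  # the normalization step
    return phi_norm @ weights                        # mixture of constant weights

def enrbf_predict(X, centers, widths, W, b):
    """Extended NRBF: each constant weight w_j becomes a linear expert
    W_j^T x + b_j, mixed with the same normalized activations."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    phi = np.exp(-d2 / (2.0 * widths ** 2))
    phi_norm = phi / phi.sum(axis=1, keepdims=True)
    experts = X @ W.T + b                            # per-center linear outputs
    return (phi_norm * experts).sum(axis=1)          # gated combination
```

Seen this way, the normalized activations act as the gating network of a mixture-of-experts model: NRBF uses constant experts, ENRBF uses linear ones, which is why both fall out as special cases of the ME framework cited above.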
Database: OpenAIRE