Distributed regularized stochastic configuration networks via the elastic net

Authors: Shi-Da Zou, Li-Jie Zhao, Ming-Zhong Huang, Guogang Wang
Year of publication: 2020
Source: Neural Computing and Applications 33:3281-3297
ISSN: 1433-3058, 0941-0643
Description: The stochastic configuration network (SCN) has great potential for developing fast learning models with sound generalization capability, and it can be readily extended to distributed computing frameworks. This paper develops a distributed regularized stochastic configuration network to address the limitations of traditional centralized learning in scalability and in the efficient use of computing and storage resources on massive datasets. Local models are constructed with the classical stochastic configuration network, and a unified global model is built via the alternating direction method of multipliers (ADMM). An elastic net regularization term, combining the LASSO and ridge penalties, is added to the loss function of the ADMM optimization to prevent overfitting when the data exhibit high-dimensional collinearity. Each layer of the local regularized SCN model at a node in the network topology is constructed incrementally; its input weights and biases are broadcast to all other nodes under the inequality constraints. The output weights and Lagrange multipliers of each node are then updated alternately through the decomposition-coordination procedure of the ADMM optimization algorithm until it converges to a unified model. A comprehensive study on five benchmark datasets and experimental data from a ball mill is carried out to evaluate the proposed method. The experimental results show that the proposed distributed regularized stochastic configuration network has advantages in accuracy and stability over the distributed random vector functional link network.
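
The decomposition-coordination procedure summarized above can be illustrated with a small numerical sketch. The Python snippet below is a hypothetical simplification, not the authors' implementation: it assumes each node k already holds the hidden-layer output matrix H_k produced by the broadcast SCN input weights and biases, and it runs consensus ADMM with an elastic net penalty (lam1 for the LASSO term, lam2 for the ridge term) so that the local output weights converge to one shared vector. All function names and parameters are invented for illustration.

    # A minimal sketch (not the paper's code): consensus ADMM for
    # elastic-net-regularized output weights of per-node SCN models.
    import numpy as np

    def soft_threshold(v, kappa):
        # Element-wise soft-thresholding operator used in the z-update.
        return np.sign(v) * np.maximum(np.abs(v) - kappa, 0.0)

    def distributed_elastic_net_admm(H_list, T_list, lam1=1e-3, lam2=1e-3,
                                     rho=1.0, n_iter=100):
        # H_list[k]: hidden-layer output matrix of node k (N_k x L)
        # T_list[k]: target vector of node k (N_k,)
        # Minimizes sum_k 0.5*||H_k b_k - T_k||^2 + lam1*||z||_1
        #           + 0.5*lam2*||z||^2  subject to b_k = z (consensus).
        K = len(H_list)
        L = H_list[0].shape[1]
        z = np.zeros(L)
        B = np.zeros((K, L))      # local output weights beta_k
        U = np.zeros((K, L))      # scaled Lagrange multipliers u_k
        # Pre-factor the per-node ridge systems (H_k^T H_k + rho*I).
        chol = [np.linalg.cholesky(H.T @ H + rho * np.eye(L)) for H in H_list]
        HtT = [H.T @ T for H, T in zip(H_list, T_list)]
        for _ in range(n_iter):
            # Local beta-update: each node solves its own regularized
            # least-squares problem using the cached Cholesky factor.
            for k in range(K):
                rhs = HtT[k] + rho * (z - U[k])
                y = np.linalg.solve(chol[k], rhs)
                B[k] = np.linalg.solve(chol[k].T, y)
            # Global z-update: elastic net proximal step on the average.
            v_bar = (B + U).mean(axis=0)
            z = soft_threshold(K * rho * v_bar, lam1) / (lam2 + K * rho)
            # Dual update: one multiplier vector per node.
            U += B - z
        return z

In this sketch, the returned z plays the role of the unified global output-weight vector; in the paper's scheme each added hidden node would enlarge H_k before the ADMM coordination is repeated.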
Database: OpenAIRE