Vortex search optimization algorithm for training of feed-forward neural network
Author: Tahir Sağ, Zainab Abdullah Jalil Jalil
Year of publication: 2021
Subject: Artificial neural network; Computer science; Supervised learning; Particle swarm optimization; Computational intelligence; Stochastic gradient descent; Artificial Intelligence; Simulated annealing; Genetic algorithm; Feedforward neural network; Computer Vision and Pattern Recognition; Algorithm; Software; 0209 industrial biotechnology; 02 engineering and technology; 020901 industrial engineering & automation; 0202 electrical engineering, electronic engineering, information engineering; 020201 artificial intelligence & image processing
Source: International Journal of Machine Learning and Cybernetics, 12:1517-1544
ISSN: 1868-808X, 1868-8071
Description: Training of feed-forward neural networks (FNNs) is a challenging nonlinear task in supervised learning systems. Moreover, derivative-based learning methods are frequently inadequate for the training phase and incur high computational complexity because of the numerous weight values that must be tuned. In this study, training a neural network is treated as an optimization process, and the best values of the weights and biases in the FNN structure are determined by the Vortex Search (VS) algorithm. The VS algorithm is a recently developed metaheuristic optimization method inspired by the vortex pattern of stirred fluids. VS carries out the training task by setting the optimal weights and biases, represented as a matrix. In this context, the proposed VS-based learning method for FNNs (VS-FNN) analyzes the effectiveness of the VS algorithm in FNN training for the first time in the literature. The proposed method is applied to six datasets: 3-bit XOR, Iris Classification, Wine Recognition, Wisconsin Breast Cancer, Pima Indians Diabetes, and Thyroid Disease. The performance of the proposed algorithm is analyzed by comparison with training methods based on Artificial Bee Colony (ABC) optimization, Particle Swarm Optimization (PSO), Simulated Annealing (SA), the Genetic Algorithm (GA), and Stochastic Gradient Descent (SGD). The experimental results show that VS-FNN is generally leading or competitive, indicating that it can serve as a capable tool for training neural networks.
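To illustrate the idea described above, the following is a minimal sketch (not the authors' implementation) of how the VS algorithm can tune the flattened weights and biases of a one-hidden-layer FNN on the 3-bit XOR problem mentioned in the abstract. The network size (3-5-1), candidate count, iteration budget, search bounds, and the MSE fitness are illustrative assumptions; the shrinking radius follows the inverse incomplete gamma decay proposed in the original VS algorithm.

```python
# Sketch: Vortex Search (VS) training of a small FNN on 3-bit XOR (parity).
# Architecture, bounds, and hyperparameters below are illustrative assumptions,
# not values taken from the paper.
import numpy as np
from scipy.special import gammaincinv

# 3-bit XOR (parity) dataset
X = np.array([[a, b, c] for a in (0, 1) for b in (0, 1) for c in (0, 1)], dtype=float)
y = (X.sum(axis=1) % 2).reshape(-1, 1)

n_in, n_hid, n_out = 3, 5, 1                         # assumed architecture
dim = n_in * n_hid + n_hid + n_hid * n_out + n_out   # total weights + biases

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def mse(vec):
    """Fitness: mean squared error of the FNN encoded by the flat vector `vec`."""
    i = 0
    W1 = vec[i:i + n_in * n_hid].reshape(n_in, n_hid); i += n_in * n_hid
    b1 = vec[i:i + n_hid]; i += n_hid
    W2 = vec[i:i + n_hid * n_out].reshape(n_hid, n_out); i += n_hid * n_out
    b2 = vec[i:i + n_out]
    out = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
    return np.mean((out - y) ** 2)

def vortex_search(fitness, dim, lower=-10.0, upper=10.0,
                  n_candidates=50, max_iter=500, x=0.1, seed=0):
    rng = np.random.default_rng(seed)
    mu = np.full(dim, (lower + upper) / 2.0)   # initial vortex centre
    sigma0 = (upper - lower) / 2.0             # initial radius
    best, best_fit = mu.copy(), fitness(mu)
    for t in range(max_iter):
        a_t = 1.0 - t / max_iter               # shrinks toward 0 over the run
        # radius via inverse incomplete gamma: starts near sigma0, decays to 0
        r = sigma0 * (1.0 / x) * gammaincinv(a_t, x)
        # candidate solutions drawn from a Gaussian around the current centre
        cand = np.clip(rng.normal(mu, r, size=(n_candidates, dim)), lower, upper)
        fits = np.apply_along_axis(fitness, 1, cand)
        k = fits.argmin()
        if fits[k] < best_fit:                 # keep the best solution found so far
            best, best_fit = cand[k].copy(), fits[k]
        mu = best                              # centre moves to the best solution
    return best, best_fit

weights, err = vortex_search(mse, dim)
print(f"final training MSE on 3-bit XOR: {err:.4f}")
```

The radius schedule is what drives the search: r starts near sigma0 = (upper - lower) / 2 and shrinks toward zero as a_t decreases, so early iterations explore widely around the centre while later iterations refine the best weights found so far.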
Database: OpenAIRE
External link: