Author:
Seyed Jalaleddin Mousavirad, Khosro Rezaee, Abdulaziz S. Almazyad, Ali Wagdy Mohamed, Davood Zabihzadeh, Mehran Pourvahab, Diego Oliva
Language:
English
Year of publication:
2024
Subject:
Source:
Alexandria Engineering Journal, Vol 109, Pp 126-143 (2024)
Document type:
article
ISSN:
1110-0168
DOI:
10.1016/j.aej.2024.08.097
Description:
Despite the effectiveness of deep neural networks, feed-forward neural networks (FFNNs) continue to play a crucial role in many applications, especially when data availability is limited. The primary challenge in FFNNs is determining the optimal weights during training, with the aim of minimising the disparity between actual and predicted outputs. Although gradient-based techniques such as backpropagation (BP) have traditionally been popular for FFNN training, they come with inherent limitations, such as sensitivity to initial weights and susceptibility to getting trapped in local optima. To overcome these challenges, we introduce a novel approach based on the Gaining-Sharing Knowledge-based (GSK) algorithm. To the best of our knowledge, this paper represents the first exploration of GSK for neural network training. After the GSK algorithm obtains suitable weights for the FFNN, these weights and biases are utilised to initialise a Levenberg–Marquardt backpropagation (LMBP) algorithm, which serves as a local search component. In other words, our proposed algorithm, GSK-LocS, combines the global search capabilities of the GSK algorithm with the local search capabilities of LMBP for neural network training. This integration mitigates sensitivity to initial values and reduces the risk of becoming trapped in local optima. Experimental results on classification and approximation problems provide compelling evidence that our proposed algorithm is highly competitive with other existing methods.
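To make the two-stage idea concrete, the sketch below is a minimal, illustrative Python example of the GSK-LocS scheme described above, not the authors' implementation: a simplified population-based global search (only loosely mimicking GSK's gaining/sharing steps) looks for promising FFNN weights on a toy sine-approximation task, and the best candidate then initialises a Levenberg–Marquardt local refinement via SciPy's least_squares(method="lm"). The network size, toy data, hyperparameters (pop_size, generations, kf), and helper names (forward, residuals, mse) are all assumptions made for illustration.
```python
# Illustrative sketch of a GSK-inspired global search followed by
# Levenberg-Marquardt local refinement (not the paper's exact algorithm).
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)

# Toy regression data (sine approximation), assumed for illustration only.
X = np.linspace(-np.pi, np.pi, 64).reshape(-1, 1)
y = np.sin(X).ravel()

n_in, n_hid = 1, 8
dim = n_in * n_hid + n_hid + n_hid + 1   # weights + biases of a 1-hidden-layer FFNN

def forward(w, X):
    """Unpack a flat weight vector and run the feed-forward network."""
    i = 0
    W1 = w[i:i + n_in * n_hid].reshape(n_in, n_hid); i += n_in * n_hid
    b1 = w[i:i + n_hid]; i += n_hid
    W2 = w[i:i + n_hid]; i += n_hid
    b2 = w[i]
    h = np.tanh(X @ W1 + b1)
    return h @ W2 + b2

def residuals(w):
    return forward(w, X) - y

def mse(w):
    r = residuals(w)
    return float(np.mean(r * r))

# --- Global phase: simplified knowledge-sharing population search (GSK-inspired). ---
pop_size, generations, kf = 30, 200, 0.5
pop = rng.normal(0.0, 1.0, size=(pop_size, dim))
fit = np.array([mse(p) for p in pop])

for g in range(generations):
    order = np.argsort(fit)
    best, worst = pop[order[0]].copy(), pop[order[-1]].copy()
    for i in range(pop_size):
        partner = pop[rng.integers(pop_size)]
        # Move toward the best individual and a random partner, away from the worst
        # (a loose analogue of GSK's gaining/sharing steps), then select greedily.
        trial = pop[i] + kf * (best - worst) + kf * (partner - pop[i]) * rng.random(dim)
        f_trial = mse(trial)
        if f_trial < fit[i]:
            pop[i], fit[i] = trial, f_trial

w_global = pop[np.argmin(fit)]

# --- Local phase: Levenberg-Marquardt refinement of the best global candidate. ---
res = least_squares(residuals, w_global, method="lm")
print(f"MSE after global phase: {mse(w_global):.5f}")
print(f"MSE after LM refinement: {mse(res.x):.5f}")
```
The point of the design is visible even in this toy setting: the population stage supplies a starting point that is not tied to any single random initialisation, and the Levenberg–Marquardt stage then converges quickly from it.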
Database:
Directory of Open Access Journals
External link: