Analysis of the adaptive learning rate and momentum effects on prediction problems in increasing the training time of the backpropagation algorithm.

Author: Alkhairi, Putrama; Wanayumini, W.; Hayadi, B. Herawan
Source: AIP Conference Proceedings; 2024, Vol. 3048, Issue 1, p1-7, 7p
Abstract: The backpropagation algorithm is a very popular learning method for feedforward multilayer perceptron networks and has been successfully applied to a variety of practical problems. Because the algorithm relies on gradient descent, it has several weaknesses, namely slow learning and a tendency to converge to local minima. Its convergence behavior depends on the network architecture, the choice of initial weights and biases, the learning rate, the momentum, and the gain value of the activation function. Several improvements and modifications have been reported over the years. Previous studies have further shown that in the feed-forward pass the slope of the activation function is directly controlled by a parameter called the gain. This study therefore proposes a method to improve the performance of the standard backpropagation algorithm: the gradient descent method with a momentum coefficient that is adapted for each node. The effect of adaptive momentum, together with the gain, on the learning ability of neural networks was analyzed for a multilayer feed-forward network, and a physical interpretation of the relationship between momentum values, learning rates, and weight values was obtained. The efficiency of the proposed algorithm relative to the conventional gradient descent method and the current gradient descent method with an adaptive learning rate was verified through simulations on three benchmark problems. Based on the observed patterns, the simulation results show that the proposed algorithm converges more quickly on the diabetes prediction problem, with a loss ratio of 0.5023 and an accuracy of 0.7461. The results show that the proposed method can significantly improve the learning speed of the current gradient descent backpropagation algorithm. [ABSTRACT FROM AUTHOR]
Database: Complementary Index
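
For illustration, the sketch below shows a minimal backpropagation update in Python with a momentum term and a simple loss-driven adaptive learning rate, in the spirit of the baseline methods discussed in the abstract. It is not the authors' algorithm: the paper adapts the momentum coefficient per node, whereas this sketch uses one global momentum value, and the gain parameter, network size, and all hyperparameter values are illustrative assumptions.

```python
import numpy as np

# Minimal sketch (not the authors' exact method): a one-hidden-layer MLP trained
# by backpropagation with a momentum term and a heuristic adaptive learning rate.
# All hyperparameter values are illustrative assumptions.

def sigmoid(z, gain=1.0):
    # 'gain' scales the slope of the sigmoid activation, as noted in the abstract.
    return 1.0 / (1.0 + np.exp(-gain * z))

def train(X, y, hidden=8, epochs=2000, lr=0.1, momentum=0.9, gain=1.0, seed=0):
    rng = np.random.default_rng(seed)
    W1 = rng.normal(scale=0.5, size=(X.shape[1], hidden))
    W2 = rng.normal(scale=0.5, size=(hidden, 1))
    V1 = np.zeros_like(W1)   # velocity (momentum) terms for each weight matrix
    V2 = np.zeros_like(W2)
    prev_loss = np.inf
    for _ in range(epochs):
        # forward pass
        H = sigmoid(X @ W1, gain)
        out = sigmoid(H @ W2, gain)
        loss = np.mean((out - y) ** 2)
        # backward pass: error signals for the squared-error loss
        d_out = (out - y) * gain * out * (1 - out) / len(X)
        dW2 = H.T @ d_out
        d_hid = (d_out @ W2.T) * gain * H * (1 - H)
        dW1 = X.T @ d_hid
        # momentum update: new step = momentum * previous step - lr * gradient
        V1 = momentum * V1 - lr * dW1
        V2 = momentum * V2 - lr * dW2
        W1 += V1
        W2 += V2
        # heuristic adaptive learning rate: grow when the loss falls, shrink otherwise
        lr = lr * 1.05 if loss < prev_loss else lr * 0.7
        prev_loss = loss
    return W1, W2

# toy usage on XOR-like data
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)
W1, W2 = train(X, y)
```

The loss-driven learning-rate rule is one common heuristic; the paper's contribution, adapting the momentum coefficient node by node, would replace the single global `momentum` value with a per-node schedule.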