Optimasi Learning Rate Neural Network Backpropagation Dengan Search Direction Conjugate Gradient Pada Electrocardiogram

Author: Lukman Hakim, Vivi Aida Fitria, Azwar Riza Habibi
Year of publication: 2020
Source: NUMERICAL: Jurnal Matematika dan Pendidikan Matematika, pp. 131-137
ISSN: 2580-2437; 2580-3573
DOI: 10.25217/numerical.v3i2.603
Description: This paper develops a backpropagation neural network (NN) trained with the conjugate gradient (CG) method. The modification lies in how the linear search direction is defined. Several CG formulas can be used to determine the step size, including the Fletcher-Reeves, Dixon, Polak-Ribiere, Hestenes-Stiefel, and Dai-Yuan methods, and they are evaluated here on discrete electrocardiogram data. The conjugate gradient is used to update the learning rate of the neural network with each of these step sizes, while the gradient search direction is used to update the NN weights. The results show that Polak-Ribiere reaches an optimal error, but its weight search direction widens, so NN training requires more epochs. The Hestenes-Stiefel and Dai-Yuan methods fail to find a gradient search direction, so they cannot update the weights, and their error and epoch count grow without bound.
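To make the described procedure concrete, the following is a minimal sketch, not the authors' implementation, of the four CG beta formulas named in the abstract, used to build the search direction d_new = -g_new + beta * d_old and a learning-rate-scaled weight update. The toy quadratic objective, the function names, and all parameter values are assumptions for illustration only; the paper applies the idea to an NN loss on electrocardiogram data.

import numpy as np

# Hypothetical sketch: conjugate gradient beta variants named in the abstract.
def beta_fletcher_reeves(g_new, g_old, d_old):
    return (g_new @ g_new) / (g_old @ g_old)

def beta_polak_ribiere(g_new, g_old, d_old):
    return (g_new @ (g_new - g_old)) / (g_old @ g_old)

def beta_hestenes_stiefel(g_new, g_old, d_old):
    return (g_new @ (g_new - g_old)) / (d_old @ (g_new - g_old))

def beta_dai_yuan(g_new, g_old, d_old):
    return (g_new @ g_new) / (d_old @ (g_new - g_old))

def cg_minimize(grad_fn, w0, beta_fn, lr=0.01, epochs=500, tol=1e-8):
    """CG-style update loop: weights move along the conjugate search direction."""
    w = w0.copy()
    g_old = grad_fn(w)
    d = -g_old                      # first direction is steepest descent
    for epoch in range(epochs):
        w = w + lr * d              # weight update along the search direction
        g_new = grad_fn(w)
        if np.linalg.norm(g_new) < tol:
            break
        beta = beta_fn(g_new, g_old, d)
        d = -g_new + beta * d       # conjugate search direction update
        g_old = g_new
    return w, epoch

if __name__ == "__main__":
    # Toy quadratic f(w) = 0.5 w^T A w - b^T w stands in for the NN loss
    # (hypothetical example data, not the ECG data used in the paper).
    A = np.array([[3.0, 0.5], [0.5, 2.0]])
    b = np.array([1.0, 1.0])
    grad_fn = lambda w: A @ w - b
    for name, beta_fn in [("Fletcher-Reeves", beta_fletcher_reeves),
                          ("Polak-Ribiere", beta_polak_ribiere),
                          ("Hestenes-Stiefel", beta_hestenes_stiefel),
                          ("Dai-Yuan", beta_dai_yuan)]:
        w, epochs_used = cg_minimize(grad_fn, np.zeros(2), beta_fn)
        print(f"{name}: w = {w}, epochs = {epochs_used}")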
Database: OpenAIRE