Author: |
Alejo Mosso Vázquez, David Juárez-Romero, José Alfredo Hernández-Pérez, Darvi Echeverría Sosa, Jimer Emir Loría Yah, Ramiro José González Horta, Gerardo Israel de Atocha Pech Carveo, Carlos Alberto Decena Chan |
Language: |
English, Spanish |
Year of Publication: |
2023 |
Subject: |
|
Source: |
Programación Matemática y Software, Vol 15, Iss 1 (2023) |
Document Type: |
article |
ISSN: |
2007-3283 |
DOI: |
10.30973/progmat/2023.15.1/5 |
Description: |
This paper explores the fundamentals of Deep Learning by examining the forward and backward signals flowing through a simple Neural Network model of the XOR function. Our purpose is to reach a deeper understanding of some outstanding concepts of Deep Learning and to grasp their significance while the Neural Network model of the XOR function is trained by the backpropagation algorithm. The chosen Neural Network model contains just one hidden layer with four neurons and an output layer with one neuron. Although this model is not a deep neural network, its hidden layer carries enough of the concepts of Deep Learning. The sigmoid is used as the activation function in all neurons. A derivation of a simple version of the Stochastic Gradient Descent algorithm is presented, which is used to minimize the output error; by backpropagating that error we arrive at the backpropagation algorithm. Numerical results are presented, which show the convergence of the output error and of a selected weight; their analysis summarizes the understanding of the fundamental concepts of Deep Learning. |
Database: |
Directory of Open Access Journals |
External Link: |
|