New Insights on Learning Rules for Hopfield Networks: Memory and Objective Function Minimisation
Author: | Jonathan H. Manton, Pavel Tolmachev |
---|---|
Language: | English |
Year of publication: | 2020 |
Subject: | FOS: Computer and information sciences; FOS: Biological sciences; Computer Science - Neural and Evolutionary Computing (cs.NE); Quantitative Biology - Neurons and Cognition (q-bio.NC); Hopfield network; Artificial neural network; Content-addressable memory; Hebbian theory; Gradient descent; Robustness (computer science); Hardware architecture; Task analysis; Neuron; Artificial intelligence |
Source: | IJCNN |
Description: | Hopfield neural networks are a possible basis for modelling associative memory in living organisms. After summarising previous studies in the field, we take a new look at learning rules, exhibiting them as descent-type algorithms for various cost functions. We also propose several new cost functions suitable for learning. We discuss the role of biases (the external inputs) in the learning process in Hopfield networks. Furthermore, we apply Newton's method for learning memories, and experimentally compare the performance of various learning rules. Finally, to add to the debate on whether allowing connections of a neuron to itself enhances memory capacity, we numerically investigate the effects of self-coupling. (An illustrative sketch of the classical setting follows this record.) Keywords: Hopfield networks, associative memory, content-addressable memory, learning rules, gradient descent, attractor networks. Note: 8 pages; IEEE Xplore; 2020 International Joint Conference on Neural Networks (IJCNN), Glasgow. |
Database: | OpenAIRE |
External link: |
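The abstract's central viewpoint is that learning rules for Hopfield networks can be read as descent-type algorithms on cost functions. As a concrete baseline, here is a minimal sketch of the classical setting: Hebbian (outer-product) learning, recall dynamics with an external bias input, and a flag for self-coupling, three of the ingredients the abstract discusses. Everything here (the function names `hebbian_weights` and `recall`, the parameter choices, the asynchronous update scheme) is an illustrative assumption, not the authors' implementation or their proposed cost functions.

```python
import numpy as np

def hebbian_weights(patterns, self_coupling=False):
    """Classical Hebbian (outer-product) rule: W = (1/N) * sum_p xi_p xi_p^T.

    patterns: array of shape (P, N) with entries in {-1, +1}.
    Zeroing the diagonal forbids connections of a neuron to itself;
    the effect of keeping them is what the paper probes numerically.
    """
    P, N = patterns.shape
    W = patterns.T @ patterns / N
    if not self_coupling:
        np.fill_diagonal(W, 0.0)
    return W

def recall(W, state, bias=None, sweeps=20, rng=None):
    """Asynchronous recall: update one neuron at a time, which for symmetric
    W with a non-negative diagonal monotonically decreases the Hopfield
    energy. `bias` plays the role of the external inputs mentioned in the
    abstract."""
    rng = np.random.default_rng() if rng is None else rng
    state = state.copy()
    N = W.shape[0]
    if bias is None:
        bias = np.zeros(N)
    for _ in range(sweeps):
        changed = False
        for i in rng.permutation(N):
            h = W[i] @ state + bias[i]
            s = 1 if h >= 0 else -1
            if s != state[i]:
                state[i] = s
                changed = True
        if not changed:  # fixed point reached
            break
    return state

# Usage: store two random patterns, then recover one from a corrupted cue.
rng = np.random.default_rng(0)
patterns = rng.choice([-1, 1], size=(2, 100))
W = hebbian_weights(patterns)
cue = patterns[0].copy()
cue[:10] *= -1                    # flip 10 bits as noise
print(np.array_equal(recall(W, cue, rng=rng), patterns[0]))
```

To illustrate the descent viewpoint itself, the sketch below trains the weights by (sub)gradient descent on a margin-violation cost, in the spirit of the Diederich-Opper perceptron-style rule. This is a standard example of a descent-type learning rule, chosen by us for illustration; it is not one of the paper's new cost functions.

```python
def descent_rule(patterns, lr=0.1, epochs=200, margin=1.0):
    """Subgradient descent on the cost
        C(W) = sum_p sum_i max(0, margin - xi_i * (W @ xi)_i),
    which is zero exactly when every stored bit is stable with the
    given margin. Each violated bit contributes the Hebbian-like
    update dW_ij += lr * xi_i * xi_j / N.
    """
    P, N = patterns.shape
    W = np.zeros((N, N))
    for _ in range(epochs):
        for xi in patterns:
            h = W @ xi
            unstable = (xi * h) <= margin     # bits violating the margin
            W += lr * np.outer(unstable * xi, xi) / N
        np.fill_diagonal(W, 0.0)              # keep self-couplings out
    return W
```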