Hebbian learning reconsidered: Representation of static and dynamic objects in associative neural nets.

Authors: Herz, A., Sulzer, B., Kühn, R., van Hemmen, J. L.
Source: Biological Cybernetics; 1989, Vol. 60, Issue 6, p457-467, 11p
Abstract: According to Hebb's postulate for learning, information presented to a neural net during a learning session is stored in the synaptic efficacies. Long-term potentiation occurs only if the postsynaptic neuron becomes active in a time window set up by the presynaptic one. We carefully interpret and mathematically implement the Hebb rule so as to handle both stationary and dynamic objects such as single patterns and cycles. Since the natural dynamics contains a rather broad distribution of delays, the key idea is to incorporate these delays in the learning session. As theory and numerical simulation show, the resulting procedure is surprisingly robust and faithful. It also turns out that pure Hebbian learning is learning by selection: the network produces synaptic representations that are selected according to their resonance with the input percepts. [ABSTRACT FROM AUTHOR]
Database: Complementary Index
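
Below is a minimal discrete-time sketch of the idea summarized in the abstract: a Hebbian rule that incorporates a range of transmission delays during the learning session, so that a cycle of patterns can be stored and retrieved by the delayed dynamics. It is an illustrative toy under assumptions made here (±1 neurons, parallel updates, the network size N, cycle length P, delay range D, and the 1/(N·T) normalization), not the authors' exact formulation.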
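```python
# Sketch: Hebbian learning with transmission delays for storing a pattern cycle.
# Assumptions: +/-1 neurons, discrete delays tau = 0..D-1, parallel dynamics.
import numpy as np

rng = np.random.default_rng(0)

N = 200   # neurons
P = 4     # length of the stored cycle: patterns xi[0], ..., xi[P-1]
D = 4     # number of discrete axonal delays tau = 0, ..., D-1

# random +/-1 patterns forming one cycle
xi = rng.choice([-1, 1], size=(P, N))

# Learning session: the cycle is clamped onto the network, so the state at
# step t is xi[t % P].  For each delay tau, the synapse J[tau, i, j] is
# potentiated when the postsynaptic neuron i becomes active one step after
# the signal that left presynaptic neuron j at time t - tau arrives, i.e. it
# correlates xi_i at t+1 with xi_j at t - tau ("post active in the window
# set up by pre").
T_learn = 10 * P
J = np.zeros((D, N, N))
for t in range(T_learn):
    post = xi[(t + 1) % P]
    for tau in range(D):
        pre = xi[(t - tau) % P]
        J[tau] += np.outer(post, pre) / (N * T_learn)
for tau in range(D):
    np.fill_diagonal(J[tau], 0.0)   # no self-coupling

def step(history):
    """Parallel update driven by the delayed local fields h_i(t)."""
    h = np.zeros(N)
    for tau in range(D):
        h += J[tau] @ history[-1 - tau]   # contribution of the state at t - tau
    return np.where(h >= 0, 1, -1)

def noisy(pattern, flip_prob=0.1):
    """Flip a fraction flip_prob of the bits of a stored pattern."""
    return pattern * np.where(rng.random(N) < flip_prob, -1, 1)

# Seed the last D time steps with noisy versions of consecutive cycle states,
# ending on a corrupted xi[0], then let the delayed dynamics run freely.
history = [noisy(xi[(i - (D - 1)) % P]) for i in range(D)]
for _ in range(40):
    history.append(step(history))

# Overlaps m_mu(t) = (1/N) sum_i xi_i^mu S_i(t): the network should walk
# through the cycle, one pattern per step, despite the initial noise.
for t in range(len(history) - P, len(history)):
    print(f"t={t:3d}  overlaps={np.round(xi @ history[t] / N, 2)}")
```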
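Note that the same rule covers the stationary case mentioned in the abstract: with P = 1 the learned couplings reduce, for every delay, to a Hopfield-type outer product, so a single pattern is stored as a fixed point rather than as a cycle.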