Author: |
Yasuo Sakata, Katsuki Katayama, Tsuyoshi Horiguchi |
Year of publication: |
2002 |
Subject: |
|
Source: |
Physica A: Statistical Mechanics and its Applications. 310:532-546 |
ISSN: |
0378-4371 |
DOI: |
10.1016/s0378-4371(02)00785-9 |
Description: |
We investigate the storage capacity of two types of fully connected layered neural networks with sparse coding, in which binary patterns are embedded by a Hebbian learning rule. The first is a layered network in which the transfer function of the even layers differs from that of the odd layers. The second is a layered network with intra-layer connections, in which the inter-layer transfer function differs from the intra-layer one, and the inter-layer and intra-layer neurons are updated alternately. We derive recursion relations for the order parameters by means of the signal-to-noise ratio method, and then apply the self-control threshold method proposed by Dominguez and Bolle to both layered networks with monotonic transfer functions. We find that the critical storage capacity α_C is approximately 0.11|a ln a|^{-1} for a ≪ 1, where a is the neuronal activity. It turns out that the basin of attraction of both layered networks is enlarged when the self-control threshold method is applied. |
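The ingredients named in the abstract — a covariance-type Hebbian rule storing sparse binary patterns between consecutive layers, and a self-control threshold set proportional to an estimate of the cross-talk noise — can be sketched as follows. This is a minimal illustration under assumed parameters (network size, noise estimate, load), not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 2000      # neurons per layer (illustrative assumption)
a = 0.1       # neuronal activity (sparseness of the stored patterns)
P = 20        # stored patterns; alpha = P/N = 0.01, well below capacity
L = 10        # number of layers

# Sparse binary patterns xi[l, mu] in {0, 1}^N with mean activity a
xi = (rng.random((L, P, N)) < a).astype(float)

# Hebbian (covariance-rule) couplings between consecutive layers
J = [(xi[l + 1] - a).T @ (xi[l] - a) / (a * (1 - a) * N)
     for l in range(L - 1)]

# Noisy version of pattern 0 as the input to layer 0 (10% of bits flipped)
s = xi[0, 0].copy()
flip = rng.random(N) < 0.1
s[flip] = 1.0 - s[flip]

alpha = P / N
c = np.sqrt(-2.0 * np.log(a))   # self-control prefactor c(a) = sqrt(-2 ln a)
for l in range(L - 1):
    h = J[l] @ s
    # Self-control threshold: proportional to a simple estimate of the
    # cross-talk noise standard deviation (assumed form, no external tuning)
    theta = c * np.sqrt(alpha * s.mean())
    s = (h > theta).astype(float)

# Overlap of the final layer's state with the embedded pattern
m = ((xi[L - 1, 0] - a) * (s - a)).sum() / (a * (1 - a) * N)
print(m > 0.8)
```

At this small load the thresholded dynamics cleans up the noisy input layer by layer, so the final overlap m stays close to 1; without a threshold adapted to the sparse activity, spurious activation of the many quiescent neurons would destroy retrieval.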
Database: |
OpenAIRE |
External link: |
|