Author:
Hyungyo Kim, Joon Hwang, Dongseok Kwon, Jangsaeng Kim, Min-Kyu Park, Jiseong Im, Byung-Gook Park, Jong-Ho Lee
Language:
English
Year of publication:
2021
Source:
Advanced Intelligent Systems, Vol 3, Iss 8, Pp n/a-n/a (2021)
Document type:
article
ISSN:
2640-4567
DOI:
10.1002/aisy.202100064
Description:
On‐chip training of neural networks (NNs) is regarded as a promising training method for neuromorphic systems with analog synaptic devices. Herein, a novel on‐chip training method called direct gradient calculation (DGC) is proposed as a substitute for conventional backpropagation (BP). In this method, the gradients of a cost function with respect to the weights are calculated directly by sequentially applying a small temporary change to each weight and measuring the resulting change in the cost value. DGC achieves accuracy similar to that of BP on a handwritten digit classification task, validating its training feasibility. In particular, DGC can be applied to analog hardware‐based convolutional NNs (CNNs), a setting where on‐chip training is considered challenging. A hybrid method that efficiently combines DGC and BP for training CNNs is also proposed; it achieves accuracy similar to that of BP and DGC alone while improving the training speed. Furthermore, networks trained with DGC maintain higher accuracy than those trained with BP in the presence of hardware variations (such as variations in synaptic device conductance and in neuron circuit components) while requiring fewer circuit components.
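The gradient estimate the abstract describes, perturbing each weight in turn and reading off the change in cost, is essentially a forward finite difference. Below is a minimal NumPy sketch of that idea on a toy software model; the names dgc_gradients, cost_fn, and delta are illustrative assumptions rather than terms from the paper, and the sketch ignores the analog-hardware details that motivate the method.

import numpy as np

def dgc_gradients(weights, cost_fn, delta=1e-3):
    # Forward-difference estimate of dC/dw: perturb one weight at a
    # time, re-evaluate the cost, and restore the weight afterwards.
    base_cost = cost_fn(weights)
    grads = np.empty_like(weights)
    for i in range(weights.size):
        weights[i] += delta                      # small temporary change
        grads[i] = (cost_fn(weights) - base_cost) / delta
        weights[i] -= delta                      # restore original weight
    return grads

# Toy usage: one linear neuron, squared-error cost on a single example.
rng = np.random.default_rng(0)
w = rng.normal(size=3)
x, target = np.array([0.5, -1.0, 2.0]), 1.0
cost = lambda wv: 0.5 * (wv @ x - target) ** 2

for _ in range(50):
    w -= 0.1 * dgc_gradients(w, cost)            # gradient-descent step
print(f"final cost: {cost(w):.6f}")              # approaches zero

Note that this estimate needs one extra cost evaluation per weight, which suggests why the hybrid scheme described above, combining DGC with BP, can improve training speed.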
Database:
Directory of Open Access Journals