Author: Yaoqi Hou, Qingtian Zhang, Namin Wang, Huaqiang Wu
Language: English
Year of publication: 2024
Source: Machine Learning: Science and Technology, Vol 5, Iss 3, p 035065 (2024)
Document type: article
ISSN: 2632-2153
DOI: 10.1088/2632-2153/ad734a
Description: As an emerging computing architecture, computing-in-memory (CIM) exhibits significant potential for energy efficiency and computing power in artificial intelligence applications. However, the intrinsic non-idealities of CIM devices, which manifest as random interference on the weights of a neural network, can significantly degrade inference accuracy. In this paper, we propose a novel training algorithm designed to mitigate the impact of weight noise. The algorithm minimizes cross-entropy loss while concurrently refining the feature representations in intermediate layers to emulate those of an ideal, noise-free network. This dual-objective approach not only preserves the accuracy of the neural network but also enhances its robustness against noise-induced degradation. Empirical validation across several benchmark datasets confirms that our algorithm sets a new benchmark for accuracy in CIM-enabled neural network applications. Compared with the most commonly used forward noise-training methods, our approach yields an accuracy gain of approximately 2% on ResNet32 with the CIFAR-10 dataset at a weight noise scale of 0.2, and a gain of at least 1% on ResNet18 with the ImageNet dataset under the same noise conditions.
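
For orientation only, below is a minimal PyTorch sketch of the dual-objective training idea the description outlines: each step records intermediate features from a noise-free forward pass, perturbs the weights with Gaussian noise to emulate CIM interference, and optimizes cross-entropy on the noisy outputs plus an MSE term pulling the noisy features toward the noise-free references. All names here (noisy_training_step, feature_modules, noise_scale, feat_weight) and the relative noise model are illustrative assumptions, not the paper's actual implementation.

import torch
import torch.nn.functional as F

def noisy_training_step(model, feature_modules, batch, optimizer,
                        noise_scale=0.2, feat_weight=1.0):
    """One training step of the dual-objective scheme sketched above.
    feature_modules: the intermediate layers whose outputs are matched
    against their noise-free counterparts (an assumption; the record
    does not specify which layers the paper uses)."""
    x, y = batch
    feats = []
    hooks = [m.register_forward_hook(lambda _m, _i, out: feats.append(out))
             for m in feature_modules]

    # 1. Noise-free reference pass: record the ideal intermediate features.
    with torch.no_grad():
        model(x)
        ref_feats = [f.detach() for f in feats]

    # 2. Emulate CIM weight interference: add Gaussian noise scaled by
    #    each tensor's mean magnitude (an assumed noise model), keeping
    #    backups so the clean weights can be restored.
    backups = [p.detach().clone() for p in model.parameters()]
    with torch.no_grad():
        for p in model.parameters():
            p.add_(noise_scale * p.abs().mean() * torch.randn_like(p))

    # 3. Noisy pass: cross-entropy on the noisy logits plus an MSE term
    #    pulling the noisy features toward the noise-free references.
    feats.clear()
    logits = model(x)
    loss = F.cross_entropy(logits, y)
    loss = loss + feat_weight * sum(
        F.mse_loss(f, r) for f, r in zip(feats, ref_feats))

    optimizer.zero_grad()
    loss.backward()

    # 4. Restore the clean weights before stepping, so the gradient
    #    (computed at the perturbed point) updates the noise-free copy.
    with torch.no_grad():
        for p, b in zip(model.parameters(), backups):
            p.copy_(b)
    optimizer.step()

    for h in hooks:
        h.remove()
    return loss.item()

In use, this step would be looped over batches; the feature-matching weight and the noise model are the obvious knobs to tune under this assumed setup.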
Database: Directory of Open Access Journals