Spin‐Transfer‐Torque Magnetic Tunnel Junction Nonlinear In‐Sensor Computing Synapse for Improving the Performance of the Feedforward Neural Network

Author: Minhui Ji, Jiayuan Wang, Liyuan Yang, Xinmiao Zhang, Yueguo Hu, Qingfa Du, Jiafei Hu, Weicheng Qiu, Junping Peng, Xiaowen Chen, Yanxiang Luo, Bin Fang, Peisen Li, Mengchun Pan
Language: English
Year of publication: 2024
Subject:
Source: Advanced Intelligent Systems, Vol 6, Iss 6, Pp n/a-n/a (2024)
Document type: article
ISSN: 2640-4567
DOI: 10.1002/aisy.202300742
Description: In-sensor computing architecture has a great advantage, especially in massive data sampling, transfer, and processing, compared with separated intelligent sensor systems. However, most in-sensor computing devices are proposed on the basis of the traditional neural network model, in which the synapse performs a linear multiplication of input and weight. This approach fails to make full use of the nonlinearity of in-sensor computing devices. Therefore, in this article, a modified feedforward neural network model with a nonlinear in-sensor computing synapse (NSCS) located at the input layer is first presented, and the backpropagation (BP) algorithm is modified to train the network. Then, the nonlinear characteristics of the NSCS, composed of spin-transfer-torque magnetic tunnel junction (STT–MTJ) devices and a simple complementary metal-oxide-semiconductor (CMOS) circuit, are analyzed. Based on the nonlinear response of the STT–MTJ NSCS, a small-scale network with the NSCS synapse is evaluated on the Modified National Institute of Standards and Technology (MNIST) dataset and compared with a traditional network of the same size. The simulation results show that better performance is achieved with the STT–MTJ NSCS, including a 2–15 times improvement in convergence speed and a 2.5%–5.1% increase in accuracy.
Database: Directory of Open Access Journals
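The abstract describes replacing the linear input-layer synapse (input × weight) with a nonlinear device response and propagating BP gradients through that nonlinearity via the chain rule. The sketch below is a minimal, illustrative NumPy version of that idea only: the tanh-shaped `nscs` function is a placeholder stand-in for the actual STT–MTJ NSCS characteristic reported in the article, and the layer sizes, loss, and learning rate are arbitrary assumptions, not values from the paper.

```python
import numpy as np

# Assumed stand-in for the STT-MTJ NSCS response: a saturating, tanh-shaped
# function of input*weight. The device's real characteristic differs; this
# placeholder only shows where a nonlinear synapse enters the input layer.
def nscs(x, w):
    return np.tanh(x * w)

def nscs_grad_w(x, w):
    # d/dw tanh(x*w) = x * (1 - tanh(x*w)^2), used by the modified BP step
    return x * (1.0 - np.tanh(x * w) ** 2)

rng = np.random.default_rng(0)
n_in, n_hidden, n_out = 4, 8, 3            # toy sizes, not from the paper
W1 = rng.normal(0, 0.5, (n_in,))           # per-input NSCS weights (input layer)
W2 = rng.normal(0, 0.5, (n_hidden, n_in))  # conventional linear hidden layer
W3 = rng.normal(0, 0.5, (n_out, n_hidden))
lr = 0.1

def forward(x):
    s = nscs(x, W1)          # nonlinear in-sensor synapse at the input layer
    h = np.tanh(W2 @ s)      # conventional hidden layer
    y = W3 @ h               # linear output (before the loss)
    return s, h, y

def train_step(x, target):
    # One modified-BP step on a single sample (mean-squared error for brevity).
    global W1, W2, W3
    s, h, y = forward(x)
    dy = y - target                      # dL/dy
    dh = (W3.T @ dy) * (1 - h ** 2)      # backprop through the tanh hidden layer
    ds = W2.T @ dh                       # gradient reaching the NSCS outputs
    W3 -= lr * np.outer(dy, h)
    W2 -= lr * np.outer(dh, s)
    W1 -= lr * ds * nscs_grad_w(x, W1)   # chain rule through the nonlinear synapse

x = rng.normal(size=n_in)
t = np.array([1.0, 0.0, 0.0])
for _ in range(20):
    train_step(x, t)
```

The only change relative to ordinary BP is the last weight update, where the derivative of the synapse nonlinearity with respect to the weight replaces the plain input term used for a linear synapse.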