Author:
Gou Saifei, Wang Yin, Dong Xiangqi, Xu Zihan, Wang Xinyu, Sun Qicheng, Xie Yufeng, Zhou Peng, Bao Wenzhong
Language:
English
Publication year:
2023
Source:
National Science Open, Vol 2 (2023)
Document type:
article
ISSN:
2097-1168
DOI:
10.1360/nso/20220071
Description:
In-memory computing is an alternative approach that can effectively accelerate the massive data-computing tasks of artificial intelligence (AI) and break the memory wall. In this work, we propose a 2T1C DRAM structure for in-memory computing. It integrates a monolayer graphene transistor, a monolayer MoS2 transistor, and a capacitor in a two-transistor-one-capacitor (2T1C) configuration. In this structure, the storage node occupies a position similar to that in one-transistor-one-capacitor (1T1C) dynamic random-access memory (DRAM), while the additional graphene transistor enables non-destructive readout of the stored information. Furthermore, the ultralow leakage current of the MoS2 transistor allows multi-level voltages to be stored on the capacitor with a long retention time. Owing to the excellent linearity of the graphene transistor, the stored charge can effectively tune its channel conductance, so that linear analog multiplication can be realized. Because of the nearly unlimited cycling endurance of DRAM, our 2T1C DRAM has great potential for in situ training and recognition, which can significantly improve the recognition accuracy of neural networks.
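The abstract's operating principle (a stored capacitor voltage linearly tunes the graphene read-transistor's conductance, and currents sum along a bit line to give a multiply-accumulate) can be sketched in a few lines. This is an illustrative behavioral model only, not the authors' device model: the conductance offset `g0`, the transconductance slope `k`, and the linear-conductance assumption are all made-up placeholders.

```python
def cell_conductance(v_stored, g0=1e-6, k=5e-6):
    """Graphene read-transistor channel conductance, modeled as linear
    in the stored capacitor voltage (g0 and k are illustrative
    device constants, not values from the paper)."""
    return g0 + k * v_stored


def read_current(v_stored, v_read):
    """Non-destructive analog multiplication: the read current is the
    product of the tuned conductance and the read voltage,
    I = G(v_stored) * v_read."""
    return cell_conductance(v_stored) * v_read


def column_mac(stored_voltages, read_voltages):
    """Multiply-accumulate along one bit line: per-cell currents sum
    by Kirchhoff's current law, yielding a weighted sum of inputs."""
    return sum(read_current(w, x)
               for w, x in zip(stored_voltages, read_voltages))
```

In this sketch the neural-network weights would be encoded as multi-level stored voltages and the activations as read voltages; the long retention time of the MoS2 write transistor is what would keep the weights stable between refreshes.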
Database:
Directory of Open Access Journals