Showing 1 - 6 of 6
for the search: '"Thomas Bohnstingl"'
Published in:
Frontiers in Neuroscience, Vol 13 (2019)
Hyperparameters and learning algorithms for neuromorphic hardware are usually chosen by hand to suit a particular task. In contrast, networks of neurons in the brain were optimized through extensive evolutionary and developmental processes to work well …
External link:
https://doaj.org/article/b996fe9dd0094fbe939887d648399b30
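The abstract above contrasts hand-tuned hyperparameters with the evolution-like optimization that shaped biological networks. An evolution-style search over a single hyperparameter can be sketched as below; the toy objective, population size, and mutation scale are illustrative assumptions, not the paper's actual method.

```python
import numpy as np

def evolve_hyperparameter(fitness, init=0.5, pop=8, gens=20, sigma=0.1, seed=0):
    """Evolution-style search: mutate the current best hyperparameter,
    keep the fittest candidate each generation (with elitism)."""
    rng = np.random.default_rng(seed)
    best = init
    for _ in range(gens):
        candidates = best + sigma * rng.normal(size=pop)  # mutated offspring
        candidates = np.append(candidates, best)          # elitism: keep parent
        best = max(candidates, key=fitness)               # select the fittest
    return best
```

Thanks to elitism, the returned value is never worse than the starting point under the given fitness function.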
Published in:
IEEE Transactions on Neural Networks and Learning Systems, pp. 1-15
Biological neural networks are equipped with an inherent capability to continuously adapt through online learning. This aspect remains in stark contrast to learning with error backpropagation through time (BPTT) that involves offline computation of …
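The contrast drawn in this abstract is between offline BPTT, which needs the whole sequence before computing gradients, and online learning, which updates at every time step from locally maintained quantities. A minimal sketch of an online, forward-in-time update using eligibility traces follows; the leaky-integrator dynamics and all parameter values are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

def online_update(w, inputs, targets, lr=0.1, decay=0.9):
    """Run a single leaky-integrator unit over a sequence, updating the
    weights at every step from local traces only (no backward pass in time)."""
    v = 0.0                        # membrane-like internal state
    trace = np.zeros_like(w)       # eligibility trace: one entry per weight
    for x, y in zip(inputs, targets):
        v = decay * v + w @ x      # forward dynamics
        trace = decay * trace + x  # local record of each weight's influence
        err = v - y                # instantaneous error (learning signal)
        w = w - lr * err * trace   # apply the update immediately
    return w
```

Because every quantity the update needs is available at the current step, memory stays constant in sequence length, unlike BPTT.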
Published in:
Nature Machine Intelligence. 2:325-336
Spiking neural networks (SNNs) incorporating biologically plausible neurons hold great promise because of their unique temporal dynamics and energy efficiency. However, SNNs have developed separately from artificial neural networks (ANNs), limiting …
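The "biologically plausible neurons" this abstract refers to are typically leaky integrate-and-fire (LIF) units, whose membrane state evolves in time and emits discrete spikes. A minimal LIF simulation is sketched below; the time constant, threshold, and input values are illustrative assumptions.

```python
import numpy as np

def lif(input_current, tau=20.0, v_th=1.0, v_reset=0.0, dt=1.0):
    """Simulate one leaky integrate-and-fire neuron.
    Returns the membrane-potential trace and the spike times."""
    v = v_reset
    spikes, vs = [], []
    for t, i_in in enumerate(input_current):
        v += dt / tau * (-v + i_in)  # leaky integration of the input
        if v >= v_th:                # threshold crossing emits a spike
            spikes.append(t)
            v = v_reset              # hard reset after the spike
        vs.append(v)
    return np.array(vs), spikes
```

With a constant supra-threshold input, the neuron fires periodically, which is the temporal behaviour that distinguishes SNNs from rate-based ANNs.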
Author:
Robert L. Bruce, Jin-Ping Han, I. Ok, Abu Sebastian, Geoffrey W. Burr, John M. Papalia, Hsinyu Tsai, Vijay Narayanan, Lynne Gignac, Katie Spoon, Tenko Yamashita, Nicole Saulnier, S. R. Nandakumar, Cheng-Wei Cheng, Andrew H. Simon, Benedikt Kersting, Charles Mackin, Irem Boybat, Stefano Ambrogio, Kevin W. Brew, Matthew J. BrightSky, Ning Li, M. Le Gallo, Praneet Adusumilli, Saraf Iqbal Rashid, Timothy M. Philip, Wanki Kim, Zuoguang Liu, Thomas Bohnstingl, S. Ghazi Sarwat, Nanbo Gong
Published in:
IRPS
Phase change memory (PCM) is rapidly emerging as a promising candidate for building non-von Neumann accelerators for deep neural networks (DNN) based on in-memory computing. However, conductance drift and noise are key challenges for the reliable storage …
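The conductance drift named in this abstract is commonly modelled as a power-law decay of the programmed conductance over time, G(t) = G0 · (t / t0)^(−ν). A sketch of that model follows; the drift coefficient value is an illustrative assumption, and real devices show device-to-device variation.

```python
def drifted_conductance(g0, t, t0=1.0, nu=0.05):
    """Power-law drift model for PCM: conductance at time t (seconds)
    after the device was programmed to g0 at reference time t0."""
    return g0 * (t / t0) ** (-nu)
```

Under this model the stored value decays slowly but monotonically, which is why drift compensation matters for reliable DNN weight storage.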
Published in:
ICECS
Biologically-inspired spiking neural networks (SNNs) hold great promise to perform demanding tasks in an energy and area-efficient manner. Memristive devices organized in a crossbar array can be used to accelerate operations of artificial neural networks …
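The acceleration this abstract alludes to comes from the fact that a crossbar array computes a matrix-vector product in a single step: voltages applied to the rows and device conductances yield column currents via Ohm's and Kirchhoff's laws. An idealized sketch, with sizes and values as illustrative assumptions:

```python
import numpy as np

def crossbar_mvm(conductances, voltages):
    """Ideal memristive crossbar: rows are driven with `voltages`,
    each column sums its currents, i_j = sum_k G[k, j] * v[k]."""
    return conductances.T @ voltages
```

Real arrays deviate from this ideal through wire resistance, device nonlinearity, and noise, which is part of what such hardware studies address.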
Published in:
Frontiers in Neuroscience, Vol 13 (2019)
Hyperparameters and learning algorithms for neuromorphic hardware are usually chosen by hand. In contrast, the hyperparameters and learning algorithms of networks of neurons in the brain, which they aim to emulate, have been optimized through extensive …
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::a51ad2ea18cdfbbd9f2ec33cdb69902c