Showing 1 - 10 of 138 results for the search: '"Rasch, Malte J."'
Given the high economic and environmental costs of using large vision or language models, analog in-memory accelerators present a promising solution for energy-efficient AI. While inference on analog accelerators has been studied recently, the training …
External link:
http://arxiv.org/abs/2406.12774
Author:
Gallo, Manuel Le, Lammie, Corey, Buechel, Julian, Carta, Fabio, Fagbohungbe, Omobayode, Mackin, Charles, Tsai, Hsinyu, Narayanan, Vijay, Sebastian, Abu, Maghraoui, Kaoutar El, Rasch, Malte J.
Published in:
APL Machine Learning (2023) 1 (4): 041102
Analog In-Memory Computing (AIMC) is a promising approach to reduce the latency and energy consumption of Deep Neural Network (DNN) inference and training. However, the noisy and non-linear device characteristics, and the non-ideal peripheral circuitry …
External link:
http://arxiv.org/abs/2307.09357
In-memory computing with resistive crossbar arrays has been suggested to accelerate deep-learning workloads in a highly efficient manner. To unleash the full potential of in-memory computing, it is desirable to accelerate the training as well as inference …
External link:
http://arxiv.org/abs/2303.04721
Author:
Rasch, Malte J., Mackin, Charles, Gallo, Manuel Le, Chen, An, Fasoli, Andrea, Odermatt, Frederic, Li, Ning, Nandakumar, S. R., Narayanan, Pritish, Tsai, Hsinyu, Burr, Geoffrey W., Sebastian, Abu, Narayanan, Vijay
Analog in-memory computing (AIMC) -- a promising approach for energy-efficient acceleration of deep learning workloads -- computes matrix-vector multiplications (MVMs), but only approximately, due to nonidealities that often are non-deterministic or non-linear …
External link:
http://arxiv.org/abs/2302.08469
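To make the nonideality point concrete, here is a minimal, illustrative sketch of hardware-aware training in PyTorch (not the authors' actual method): multiplicative weight perturbations and additive output noise stand in for the approximate analog MVM, and training through that noise nudges the weights toward configurations that stay accurate under it. All noise shapes and magnitudes below are assumptions chosen for illustration.

```python
import torch
from torch.nn.functional import mse_loss

def noisy_mvm(weight, x, w_noise=0.02, out_noise=0.01):
    """Crude stand-in for an analog MVM: multiplicative weight perturbation
    plus additive output noise (illustrative nonideality model only)."""
    w_pert = weight * (1.0 + w_noise * torch.randn_like(weight))
    y = x @ w_pert.t()
    return y + out_noise * torch.randn_like(y)

torch.manual_seed(0)
w = torch.randn(8, 16, requires_grad=True)
x = torch.randn(32, 16)
target = torch.randn(32, 8)

# Hardware-aware training: inject the noise in the forward pass so the
# (digital) gradients adapt the weights to the noisy analog compute.
opt = torch.optim.SGD([w], lr=0.05)
for _ in range(200):
    opt.zero_grad()
    loss = mse_loss(noisy_mvm(w, x), target)
    loss.backward()
    opt.step()
```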
Author:
Rasch, Malte J., Moreda, Diego, Gokmen, Tayfun, Gallo, Manuel Le, Carta, Fabio, Goldberg, Cindy, Maghraoui, Kaoutar El, Sebastian, Abu, Narayanan, Vijay
We introduce the IBM Analog Hardware Acceleration Kit, a new and first-of-its-kind open-source toolkit to simulate analog crossbar arrays in a convenient fashion from within PyTorch (freely available at https://github.com/IBM/aihwkit). The toolkit is …
External link:
http://arxiv.org/abs/2104.02184
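For context, a minimal training loop in the style of the toolkit's documented basic usage; the class names AnalogLinear and AnalogSGD come from the public aihwkit package, but exact signatures and defaults may differ between versions.

```python
from torch import Tensor
from torch.nn.functional import mse_loss

from aihwkit.nn import AnalogLinear
from aihwkit.optim import AnalogSGD

# Tiny regression problem: 4 inputs, 2 outputs.
x = Tensor([[0.1, 0.2, 0.4, 0.3], [0.2, 0.1, 0.1, 0.3]])
y = Tensor([[1.0, 0.5], [0.7, 0.3]])

model = AnalogLinear(4, 2)            # weights live on a simulated analog tile
opt = AnalogSGD(model.parameters(), lr=0.1)
opt.regroup_param_groups(model)       # bind the analog tiles to the optimizer

for _ in range(100):
    opt.zero_grad()
    loss = mse_loss(model(x), y)
    loss.backward()
    opt.step()                        # applies the simulated analog weight update
```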
Accelerating the training of artificial neural networks (ANNs) with analog resistive crossbar arrays is a promising idea. While the concept has been verified on very small ANNs and toy data sets (such as MNIST), more realistically sized ANNs and datasets …
External link:
http://arxiv.org/abs/1906.02698
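The training acceleration referred to here relies on the crossbar applying the SGD weight update as a parallel rank-one outer product in place on the array. A NumPy sketch of that update for a single linear layer follows; the pulsed, stochastic way real devices realize it is not modeled, and all sizes are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(8, 16))   # weights stored on the crossbar
x = rng.normal(size=16)                   # forward input (applied as voltages)
target = rng.normal(size=8)
lr = 0.1

y = W @ x                                 # forward MVM on the array
err = y - target                          # backward pass delivers the error
# Rank-one outer-product update: on an analog array every element W[i, j] is
# updated simultaneously from the coincidence of the x[j] and err[i] signals.
W -= lr * np.outer(err, x)
```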
Academic article
This result is only available to signed-in users.
Analog arrays are a promising upcoming hardware technology with the potential to drastically speed up deep learning. Their main advantage is that they compute matrix-vector products in constant time, irrespective of the size of the matrix. However, …
External link:
http://arxiv.org/abs/1807.01356
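The constant-time claim comes from how a crossbar realizes the MVM physically: weights are stored as conductances, inputs are applied as voltages, and every column current is integrated at the same time. The NumPy sketch below reproduces only the arithmetic of that mapping, not the parallelism, and the sizes are arbitrary.

```python
import numpy as np

# Crossbar view of an MVM: weights are conductances G[i, j], the input vector
# is applied as voltages V[i], and each column current is
# I[j] = sum_i G[i, j] * V[i] (Kirchhoff's current law). In hardware all
# columns integrate their currents simultaneously, which is why the MVM takes
# roughly constant time regardless of matrix size.
rng = np.random.default_rng(0)
G = rng.uniform(0.0, 1.0, size=(16, 8))   # conductance matrix (weights)
V = rng.normal(size=16)                   # input voltages
I = V @ G                                 # column currents = MVM result
```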
Academic article
This result is only available to signed-in users.
Author:
Gretton, Arthur, Borgwardt, Karsten, Rasch, Malte J., Scholkopf, Bernhard, Smola, Alexander J.
We propose a framework for analyzing and comparing distributions, allowing us to design statistical tests to determine if two samples are drawn from different distributions. Our test statistic is the largest difference in expectations over functions in the unit ball of a reproducing kernel Hilbert space (RKHS) …
External link:
http://arxiv.org/abs/0805.2368
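The test statistic described here is the maximum mean discrepancy (MMD). Below is a minimal sketch of its standard unbiased MMD^2 estimator with a Gaussian kernel; the kernel choice and bandwidth are illustrative, not prescribed by this snippet.

```python
import numpy as np

def rbf_kernel(a, b, sigma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of a and the rows of b."""
    sq = np.sum(a**2, 1)[:, None] + np.sum(b**2, 1)[None, :] - 2.0 * a @ b.T
    return np.exp(-sq / (2.0 * sigma**2))

def mmd2_unbiased(x, y, sigma=1.0):
    """Unbiased estimate of MMD^2 between samples x ~ P and y ~ Q."""
    m, n = len(x), len(y)
    kxx = rbf_kernel(x, x, sigma)
    kyy = rbf_kernel(y, y, sigma)
    kxy = rbf_kernel(x, y, sigma)
    # Drop diagonal terms for the unbiased within-sample averages.
    term_x = (kxx.sum() - np.trace(kxx)) / (m * (m - 1))
    term_y = (kyy.sum() - np.trace(kyy)) / (n * (n - 1))
    return term_x + term_y - 2.0 * kxy.mean()

rng = np.random.default_rng(0)
same = mmd2_unbiased(rng.normal(size=(200, 2)), rng.normal(size=(200, 2)))
diff = mmd2_unbiased(rng.normal(size=(200, 2)), rng.normal(1.0, 1.0, (200, 2)))
print(same, diff)  # near zero for same distribution, clearly positive otherwise
```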