Confronting machine-learning with neuroscience for neuromorphic architectures design
Author: | Lyes Khacef, Nassim Abderrahmane, Benoit Miramond |
---|---|
Contributors: | Laboratoire d'Electronique, Antennes et Télécommunications (LEAT), Université Côte d'Azur (UCA), Université Nice Sophia Antipolis (... - 2019) (UNS), COMUE Université Côte d'Azur (2015-2019) (COMUE UCA), Centre National de la Recherche Scientifique (CNRS); IEEE; ANR-15-IDEX-0001, UCA JEDI, Idex UCA JEDI (2015); Bio-inspired systems and circuits |
Language: | English |
Year of publication: | 2018 |
Subject: | Brain modeling; artificial neural networks; deep neural networks; spiking neural networks; multilayer perceptrons; Perceptron; machine learning; machine learning algorithms; learning (artificial intelligence); artificial intelligence; artificial intelligence research; neuroscience; biological neural networks; biological system modeling; Neurons; Quantitative Biology::Neurons and Cognition; Computer Science::Neural and Evolutionary Computation; neuromorphic engineering; neuromorphic architectures; neuromorphic architecture design; hardware accelerator; hardware acceleration; hardware cost; computer architecture; computation model; computational modeling; distributed computation paradigm; graphics processing units; CPU-GPU embedded implementation; embedded systems; embedded artificial intelligence; energy consumption; power consumption; power-aware computing; computing power; contextual image classification; [INFO.INFO-AR] Computer Science [cs]/Hardware Architecture [cs.AR]; [INFO.INFO-NE] Computer Science [cs]/Neural and Evolutionary Computing [cs.NE]; [INFO.INFO-BI] Computer Science [cs]/Bioinformatics [q-bio.QM] |
Source: | IEEE World Congress on Computational Intelligence (IJCNN), Jul 2018, Rio de Janeiro, Brazil, pp. 1-8, ⟨10.1109/IJCNN.2018.8489241⟩; Proceedings of the International Joint Conference on Neural Networks; BASE - Bielefeld Academic Search Engine |
DOI: | 10.1109/IJCNN.2018.8489241 |
Description: | International audience; Artificial neural networks are experiencing an unprecedented surge of interest today thanks to two main changes: the explosion of open data needed for their training, and the growing computing power of modern machines, which makes the training phase feasible in a reasonable time. The recent results of deep neural networks on image classification have given neural networks the leading role in machine-learning algorithms and artificial-intelligence research. However, most applications, such as smart devices or autonomous vehicles, require an embedded implementation of neural networks. Their implementation on CPUs/GPUs remains too expensive, mostly in energy consumption, because the hardware is not adapted to the computation model, and this limits their use. It is therefore necessary to design neuromorphic architectures, i.e. hardware accelerators that fit the parallel and distributed computation paradigm of neural networks, in order to reduce their hardware implementation cost. We mainly focus on optimizing energy consumption to enable integration in embedded systems. For this purpose, we implement two models of artificial neural networks coming from two different scientific domains: the multi-layer perceptron derived from machine learning and the spiking neural network inspired by neuroscience. We compare the performance of both approaches in terms of accuracy and hardware cost to determine the most attractive architecture for the design of embedded artificial intelligence. (An illustrative sketch contrasting the two neuron models follows this record.) |
Database: | OpenAIRE |
External link: |
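
The description above contrasts a frame-based multi-layer perceptron with an event-driven spiking neural network. The Python sketch below is only an illustration of those two computation models, not the authors' implementation: it compares a single sigmoid perceptron neuron with a leaky integrate-and-fire (LIF) neuron driven by rate-coded spikes, and every parameter (weights, inputs, leak, threshold, number of time steps) is an assumed value chosen for the example.

```python
# Minimal sketch (assumed parameters, not the paper's models): the same weighted
# inputs evaluated by a frame-based perceptron neuron and by an event-driven
# leaky integrate-and-fire (LIF) spiking neuron with rate-coded inputs.
import numpy as np

rng = np.random.default_rng(0)

weights = np.array([0.8, -0.4, 0.6])   # shared synaptic weights (arbitrary)
x = np.array([0.9, 0.2, 0.7])          # analog inputs, e.g. pixel intensities

def perceptron_neuron(x, w):
    # Frame-based evaluation: one weighted sum followed by a sigmoid activation.
    return 1.0 / (1.0 + np.exp(-np.dot(w, x)))

def lif_neuron(x, w, T=100, leak=0.95, threshold=1.0):
    # Event-driven evaluation: rate-coded input spikes integrated over T steps.
    v = 0.0            # membrane potential
    spikes_out = 0
    for _ in range(T):
        spikes_in = (rng.random(x.shape) < x).astype(float)  # Bernoulli rate coding
        v = leak * v + np.dot(w, spikes_in)                   # leaky membrane update
        if v >= threshold:                                    # fire and reset
            spikes_out += 1
            v = 0.0
    return spikes_out / T   # output firing rate

print("MLP neuron activation :", perceptron_neuron(x, weights))
print("LIF neuron firing rate:", lif_neuron(x, weights))
```

In the spiking case the output firing rate plays the role the sigmoid activation plays in the perceptron, which is the usual correspondence used when comparing the two models on the same classification task.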