A New Hardware Architecture for FPGA Implementation of Feed Forward Neural Networks

Authors: V.A Sumayyabeevi, N Aswathy, Jaimy James Poovely, S Chinnu
Year of publication: 2021
Source: 2021 2nd International Conference on Advances in Computing, Communication, Embedded and Secure Systems (ACCESS).
Description: Artificial neural networks (ANNs) are among the most popular and fastest-growing machine learning algorithms today, and there are many ways to implement an ANN in practice; broadly, the two main techniques are neuromorphic programming and conventional neural networks. This paper presents an overview of such methods. Machine-learning chips with highly parallel designs are now available, but deep neural networks require a flexible and efficient hardware structure that suits any type of network. A variety of hardware topologies is also available for FPGA implementation. This paper reviews those architectural variations and proposes a new topology. The proposed architecture adopts a systolic structure and applies to any feed-forward neural network, such as the Multi-Layer Perceptron (MLP), Auto-Encoder (AE), and Logistic Regression (LR). Unlike other hardware neural-network structures, this architecture implements only a single activation-function block and only the largest layer. The paper also includes the implementation of a feed-forward neural network for digit recognition (0 to 9) on the Zynq-7000 board with the MNIST dataset. Different activation functions, and different parameters for each activation function, are evaluated. Improvements are reported in terms of accuracy, operating frequency, and resource usage; the logistic sigmoid function achieves higher accuracy and better performance than the other activation functions.
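The class of networks the abstract targets (feed-forward MLP/AE/LR with a logistic sigmoid activation, as implemented in hardware for MNIST digit recognition) can be illustrated with a minimal software sketch. This is not the paper's hardware design; it is a hypothetical NumPy model of the forward pass, with illustrative layer sizes (784 inputs for MNIST, one hidden layer, 10 output digits) and random untrained weights chosen only for demonstration.

```python
import numpy as np

def sigmoid(x):
    """Logistic sigmoid, the activation the paper finds most accurate."""
    return 1.0 / (1.0 + np.exp(-x))

def forward(x, weights, biases):
    """Forward pass of a feed-forward network: each layer computes
    sigmoid(W @ a + b) on the previous layer's activations."""
    a = x
    for W, b in zip(weights, biases):
        a = sigmoid(W @ a + b)
    return a

# Illustrative MNIST-shaped network: 784 -> 32 -> 10 (untrained weights).
rng = np.random.default_rng(0)
weights = [rng.standard_normal((32, 784)) * 0.01,
           rng.standard_normal((10, 32)) * 0.01]
biases = [np.zeros(32), np.zeros(10)]

x = rng.random(784)          # one flattened 28x28 image (placeholder data)
y = forward(x, weights, biases)
print(y.shape)               # ten class scores, one per digit 0-9
```

In the proposed hardware, the matrix-vector products above would be carried out by a systolic array sized for the largest layer and reused across layers, with a single shared activation-function block, rather than instantiating every layer separately.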
Database: OpenAIRE