FPGA Implementation of a Fully and Partially Connected MLP.

Authors: Omondi, Amos R.; Rajapakse, Jagath C.; Canas, Antonio; Ortigosa, Eva M.; Ros, Eduardo; Ortigosa, Pilar M.
Source: FPGA Implementations of Neural Networks; 2006, p271-296, 26p
Abstract: In this work, we present several hardware implementations of a standard Multi-Layer Perceptron (MLP) and a modified version called eXtended Multi-Layer Perceptron (XMLP). This extended version is an MLP-like feed-forward network with two-dimensional layers and configurable connection pathways, in which the interlayer connectivity can be restricted according to well-defined patterns. This restriction yields a faster and smaller system with classification capabilities similar to those of the fully connected network, and the hardware implementations presented here take full advantage of this optimization feature. Furthermore, the software version of the XMLP allows configurable activation functions and batched backpropagation with different smoothing-momentum alternatives. The hardware implementations have been developed and tested on an FPGA prototyping board. The designs have been defined at two different abstraction levels: register transfer level (VHDL) and a higher, algorithmic-like level (Handel-C). We compare the two description strategies and study different implementation versions with diverse degrees of parallelism. The test-bed application addressed is speech recognition. The implementations described here could be used for low-cost portable systems. We include a short study of the implementation costs (silicon area), speed, and required computational resources. [ABSTRACT FROM AUTHOR]
Database: Supplemental Index
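
To illustrate the partial-connectivity idea summarized in the abstract, the following minimal C sketch shows how a binary connection mask can restrict which inputs each neuron of a layer reads, so pruned connections cost no multiply-accumulate operations. This is only an illustrative sketch under assumed names, layer sizes, and a hypothetical windowed connection pattern; it is not the authors' XMLP software or hardware code.

    /* Minimal sketch of a partially connected MLP layer.
     * A binary mask restricts which inputs each neuron uses, so fewer
     * multiply-accumulate operations are performed than in a fully
     * connected layer. All names and sizes are illustrative assumptions. */
    #include <stdio.h>
    #include <math.h>

    #define N_IN  8   /* inputs to the layer (hypothetical size)  */
    #define N_OUT 4   /* neurons in the layer (hypothetical size) */

    /* Forward pass of one layer: y = f(W*x + b), where a weight is used
     * only if the mask marks that connection as present. */
    static void layer_forward(const float x[N_IN],
                              const float w[N_OUT][N_IN],
                              const unsigned char mask[N_OUT][N_IN],
                              const float bias[N_OUT],
                              float y[N_OUT])
    {
        for (int j = 0; j < N_OUT; ++j) {
            float acc = bias[j];
            for (int i = 0; i < N_IN; ++i) {
                if (mask[j][i])          /* skip pruned connections */
                    acc += w[j][i] * x[i];
            }
            y[j] = tanhf(acc);           /* one possible activation choice */
        }
    }

    int main(void)
    {
        float x[N_IN] = {0.1f, -0.2f, 0.3f, 0.0f, 0.5f, -0.4f, 0.2f, 0.1f};
        float w[N_OUT][N_IN] = {{0}}, bias[N_OUT] = {0}, y[N_OUT];
        unsigned char mask[N_OUT][N_IN] = {{0}};

        /* Hypothetical "well-defined pattern": neuron j only sees a
         * window of up to 4 inputs starting at input 2*j. */
        for (int j = 0; j < N_OUT; ++j)
            for (int i = 2 * j; i < 2 * j + 4 && i < N_IN; ++i) {
                mask[j][i] = 1;
                w[j][i] = 0.25f;
            }

        layer_forward(x, w, mask, bias, y);
        for (int j = 0; j < N_OUT; ++j)
            printf("y[%d] = %f\n", j, y[j]);
        return 0;
    }

In a hardware mapping, absent connections in such a mask would simply not be wired or stored, which is the kind of area and speed saving the abstract attributes to the restricted interlayer connectivity of the XMLP.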