CONFIGURING AND OPTIMIZING THE BACK-PROPAGATION NETWORK

Authors: Stanley P. Franklin, Alianna J. Maren, Dan Jones
Year of publication: 1990
DOI: 10.1016/b978-0-12-546090-3.50019-x
Description: This chapter focuses on ways of configuring and optimizing a back-propagation network. The performance of back-propagation and similar multi-layer feedforward networks can be improved chiefly by modifying the structure, the dynamics, or the training and learning rules. The structure of a basic feedforward network can be modified in several ways. At the microstructural level, a new transfer function can be adopted or new types of connection weights introduced. For example, the microstructure of a Perceptron-like network can be modified by using a radial basis function as the transfer function in the hidden layer; radial basis functions are particularly useful for complex mapping tasks where the mapping is continuous. High-order connection networks, sometimes called “sigma-pi” networks, achieve greater processing power by using complex connections in which various combinations of inputs are multiplied together. The functional-link network achieves nonlinear responses with a single-layer net that has no hidden nodes; this is accomplished by applying nonlinear functions to some or all of the inputs before they are fed into the network.
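The functional-link idea described above can be sketched with a small, hypothetical NumPy example (the expansion choice and training loop are illustrative assumptions, not the chapter's own code): augmenting the two raw inputs with their product term makes the XOR problem linearly separable, so a single-layer net with no hidden nodes can learn it.

```python
import numpy as np

def functional_link(x):
    # Hypothetical functional-link expansion: append the cross-product
    # term x1*x2 to the raw inputs (shape (n, 2) -> (n, 3)).
    return np.column_stack([x, x[:, 0] * x[:, 1]])

# XOR: not linearly separable in the original 2-D input space.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([0., 1., 1., 0.])

Z = functional_link(X)  # expanded inputs, shape (4, 3)

rng = np.random.default_rng(0)
w = rng.normal(scale=0.1, size=3)
b = 0.0
lr = 0.5

for _ in range(5000):
    out = 1.0 / (1.0 + np.exp(-(Z @ w + b)))   # sigmoid output unit
    grad = (y - out) * out * (1.0 - out)       # delta-rule error signal
    w += lr * Z.T @ grad                       # single-layer weight update
    b += lr * np.sum(grad)

pred = (1.0 / (1.0 + np.exp(-(Z @ w + b))) > 0.5).astype(float)
```

After training, `pred` matches the XOR targets, even though the network itself is only a single layer; all of the nonlinearity comes from the input expansion.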
Database: OpenAIRE