Description: |
One of the most widely used artificial neural networks is the multi-layer perceptron, trained by error back-propagation (the 'back-propagation algorithm'). Commonly the network is implemented as a serial-computer simulation, but there has been considerable interest in translating it into hardware. The part of the algorithm most difficult to translate into analogue VLSI is the 'learning' part: calculating the output errors and making appropriate modifications to the analogue weights representing the connections between nodes. For this reason, most analogue hardware implementations train weights held off-chip in a digital representation; the weights are then converted to an analogue representation for storage on the chip which comprises the network. This thesis examines the Virtual Targets algorithm, which is based on back-propagation but modified to make it more amenable to translation into analogue VLSI circuits that can 'learn on-chip'. I describe several circuits, designed to exploit our research group's pulse-stream approach to analogue VLSI, which provide four-quadrant multiplication and calculate differences, signs and error-derivatives. Results from simulation, and from a chip fabricated with the circuits, are given. A consideration of other approaches to the problem of learning on-chip makes it clear that the key issues are weight storage and a means of modifying the weights. I explain why calculating exact weight-changes is difficult, and give the results of simulation experiments leading to a further simplification of the Virtual Targets algorithm which makes it possible to train the network using fixed increments and decrements of the weights. I show the results of tests of circuits on a second chip, designed with implementation of the entire algorithm in mind, and assess the likelihood of such an implementation being successful.
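The idea of training with fixed increments and decrements can be illustrated with a small sketch. This is not the Virtual Targets algorithm itself; it is the classical fixed-increment perceptron rule on a toy problem (logical AND), chosen only because it shows weights adjusted by a constant step whose direction comes from the sign of the output error — the kind of simplification the abstract describes. The step size and task are illustrative assumptions.

```python
# Hedged sketch: training with fixed weight increments/decrements.
# NOT the Virtual Targets algorithm; a generic sign-driven update rule.

STEP = 0.1  # fixed increment/decrement applied to each weight (assumed value)

def predict(w, b, x):
    """Hard-threshold unit: fires if the weighted sum is positive."""
    s = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 if s > 0 else 0

def train(data, epochs=20):
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, target in data:
            err = target - predict(w, b, x)  # err is -1, 0, or +1
            if err:
                # The weight change is a fixed step; only its direction
                # depends on the sign of the error and the input.
                w = [wi + STEP * err * xi for wi, xi in zip(w, x)]
                b += STEP * err
    return w, b

# Toy task: logical AND.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train(data)
print([predict(w, b, x) for x, _ in data])  # -> [0, 0, 0, 1]
```

Because the update magnitude is constant, such a rule needs no precise multiplication of error by activation, which is what makes sign-based schemes attractive for analogue hardware.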
I place this analysis in the context of the search for 'intelligent' machines, and ask how far designs such as my own might contribute to such a machine. I also suggest the most fruitful directions for analogue designs of artificial neural networks. |