Analogue imprecision in MLPs: implications and learning improvements

Author: Edwards, Peter J.
Year of publication: 1994
Subject:
Document type: Electronic Thesis or Dissertation
Description: Analogue hardware implementations of Multi-Layer Perceptrons (MLPs) have a limited precision that has a detrimental effect on the result of synaptic multiplication. At the same time, however, the accuracy of the circuits can be very high with good design. This thesis investigates the consequences of the imprecision for the performance of the MLP, examining whether it is accuracy or precision that matters in neural computation. The results of this thesis demonstrate that, far from having a detrimental effect, the imprecision, or synaptic weight noise, enhances the performance of the solution. In particular, the fault tolerance and generalisation ability are improved. In addition, under certain conditions, the learning trajectory of the training network is also improved. The enhancements are derived through a mathematical analysis and confirmed in verification experiments. Simulation experiments examine the underlying mechanisms and probe the limitations of the technique as an enhancement scheme. For a variety of problems, precision is shown to be significantly less important than accuracy. In fact, imprecision can have beneficial effects on learning performance.
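The core idea described above, injecting synaptic weight noise during training, can be sketched as follows. This is an illustrative example only, not the thesis's experimental setup: the network size (2-4-1), the XOR task, the multiplicative Gaussian noise model, and the noise level of 5% are all assumptions chosen for a minimal demonstration.

```python
import numpy as np

# Minimal sketch: a 2-4-1 sigmoid MLP trained on XOR, with multiplicative
# Gaussian noise applied to the weights on every forward pass to mimic
# analogue synaptic imprecision. All hyperparameters here are assumed
# values for illustration, not taken from the thesis.
rng = np.random.default_rng(0)

X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

W1, b1 = rng.normal(0, 1, (2, 4)), np.zeros(4)
W2, b2 = rng.normal(0, 1, (4, 1)), np.zeros(1)
noise = 0.05   # relative std of the synaptic weight noise (assumed)
lr = 0.5

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(W1n, W2n):
    h = sigmoid(X @ W1n + b1)
    return h, sigmoid(h @ W2n + b2)

_, out0 = forward(W1, W2)
mse_before = float(np.mean((out0 - y) ** 2))

for _ in range(5000):
    # Perturb each weight multiplicatively before using it, as an
    # imprecise analogue synapse would during computation.
    W1n = W1 * (1 + noise * rng.normal(size=W1.shape))
    W2n = W2 * (1 + noise * rng.normal(size=W2.shape))
    h, out = forward(W1n, W2n)
    d_out = (out - y) * out * (1 - out)        # MSE + sigmoid gradient
    d_h = (d_out @ W2n.T) * h * (1 - h)
    W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(0)
    W1 -= lr * (X.T @ d_h);   b1 -= lr * d_h.sum(0)

# Evaluate with the clean (noise-free) weights after noisy training.
_, out1 = forward(W1, W2)
mse_after = float(np.mean((out1 - y) ** 2))
print(mse_before, mse_after)
```

Training through the noisy weights while updating the clean underlying weights is one common way to model analogue imprecision in simulation; the thesis's mathematical analysis relates this kind of perturbation to the fault-tolerance and generalisation improvements reported.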
Database: Networked Digital Library of Theses & Dissertations