Improving Training in the Vicinity of Temporary Minima

Authors: Michael Margaliot, Ido Roth
Year of publication: 2009
Source: Lecture Notes in Computer Science, ISBN: 9783642024771
IWANN (1)
DOI: 10.1007/978-3-642-02478-8_17
Description: An important problem in learning with gradient-descent algorithms (such as backpropagation) is the slowdown incurred by temporary minima (TM). We consider this problem for an artificial neural network trained to solve the XOR problem. The network is transformed into the equivalent all-permutations fuzzy rule-base, which provides a symbolic representation of the knowledge embedded in the network. We develop a mathematical model for the evolution of the fuzzy rule-base parameters during learning in the vicinity of a TM. We show that the rule-base becomes singular and tends to remain singular in the vicinity of the TM. Our analysis suggests a simple remedy for overcoming the slowdown in learning incurred by TM: slightly perturbing the values of the training examples so that they are no longer symmetric. Simulations demonstrate the usefulness of this approach.
Database: OpenAIRE
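The remedy described in the abstract, breaking the symmetry of the training set by slightly perturbing the example values, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name `perturb_examples`, the perturbation magnitude `eps`, and the choice of uniform input noise are assumptions for demonstration purposes.

```python
import random

# The standard XOR training set. The inputs are symmetric: swapping
# the two input coordinates maps the set onto itself, which is the
# kind of symmetry the perturbation is meant to break.
xor_examples = [
    ((0.0, 0.0), 0.0),
    ((0.0, 1.0), 1.0),
    ((1.0, 0.0), 1.0),
    ((1.0, 1.0), 0.0),
]

def perturb_examples(examples, eps=0.05, seed=0):
    """Return a copy of the training set with each input coordinate
    shifted by a small random amount in [-eps, eps], so the examples
    are no longer exactly symmetric. Targets are left unchanged."""
    rng = random.Random(seed)
    perturbed = []
    for (x1, x2), y in examples:
        perturbed.append(
            ((x1 + rng.uniform(-eps, eps),
              x2 + rng.uniform(-eps, eps)), y)
        )
    return perturbed

perturbed = perturb_examples(xor_examples)
```

A network would then be trained on `perturbed` instead of `xor_examples`; since the perturbation is bounded by `eps`, the learning task is essentially unchanged while the exact symmetry that sustains the temporary minimum is removed.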