Training neural networks with the GRG2 nonlinear optimizer
Author: | Ming S. Hung, James W. Denton |
---|---|
Year of publication: | 1993 |
Subject: | Information Systems and Management, General Computer Science, Artificial neural network, Computer science, Management Science and Operations Research, Machine learning, Industrial and Manufacturing Engineering, Backpropagation, Nonlinear programming, Nonlinear system, Modeling and Simulation, Adaptive system, Pattern recognition (psychology), Artificial intelligence, Gradient descent |
Source: | European Journal of Operational Research. 69:83-91 |
ISSN: | 0377-2217 |
Description: | Neural networks represent a new approach to artificial intelligence. By using biologically motivated, intensively interconnected networks of simple processing elements, certain pattern recognition tasks can be accomplished much faster than with currently used techniques. The most popular means of training these networks is backpropagation, a gradient descent technique. The introduction of backpropagation revolutionized research in neural networks, but the method has serious drawbacks in training speed and scalability to large problems. This paper compares the use of a general-purpose nonlinear optimizer, GRG2, with backpropagation in training neural networks. Parity problems of increasing size are used to evaluate how well each method scales to larger problems. GRG2 not only found solutions much faster but also found substantially better ones. The use of nonlinear programming methods in training therefore has the potential to allow neural networks to be applied to problems that have previously been beyond their capabilities. (An illustrative sketch of this comparison follows the record below.) |
Database: | OpenAIRE |
External link: |
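
As a rough illustration of the comparison described in the abstract, the sketch below trains a small one-hidden-layer network on the 3-bit parity problem, first with plain gradient descent (the backpropagation baseline) and then with a general-purpose nonlinear optimizer applied to the same loss surface. GRG2 itself is a generalized reduced gradient code that is not available in SciPy, so scipy.optimize's BFGS method stands in for it here; the network size, learning rate, and iteration count are illustrative assumptions, not values from the paper.

```python
# Hypothetical sketch: gradient descent ("backpropagation") vs. a
# general-purpose nonlinear optimizer on 3-bit parity. BFGS stands in
# for GRG2, which is not part of SciPy; all hyperparameters are
# illustrative assumptions, not values taken from the paper.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# 3-bit parity: target is 1 when the input has an odd number of ones.
X = np.array([[int(b) for b in f"{i:03b}"] for i in range(8)], dtype=float)
y = (X.sum(axis=1) % 2).reshape(-1, 1)

n_in, n_hid, n_out = 3, 4, 1
shapes = [(n_in, n_hid), (1, n_hid), (n_hid, n_out), (1, n_out)]
sizes = [int(np.prod(s)) for s in shapes]

def unpack(w):
    """Split the flat parameter vector into weight and bias arrays."""
    parts, i = [], 0
    for s, n in zip(shapes, sizes):
        parts.append(w[i:i + n].reshape(s))
        i += n
    return parts

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(w):
    W1, b1, W2, b2 = unpack(w)
    h = sigmoid(X @ W1 + b1)
    return sigmoid(h @ W2 + b2), h

def loss(w):
    out, _ = forward(w)
    return 0.5 * np.sum((out - y) ** 2)

def grad(w):
    """Backpropagation: analytic gradient of the sum-of-squares loss."""
    W1, b1, W2, b2 = unpack(w)
    out, h = forward(w)
    d2 = (out - y) * out * (1 - out)   # output-layer delta
    d1 = (d2 @ W2.T) * h * (1 - h)     # hidden-layer delta
    g = [X.T @ d1, d1.sum(0, keepdims=True),
         h.T @ d2, d2.sum(0, keepdims=True)]
    return np.concatenate([a.ravel() for a in g])

w0 = rng.normal(scale=0.5, size=sum(sizes))

# (a) Plain gradient descent, as in vanilla backpropagation training.
w = w0.copy()
for _ in range(20000):
    w -= 0.5 * grad(w)
print("gradient descent loss:", loss(w))

# (b) A general-purpose nonlinear optimizer on the same loss surface
# (BFGS here; the paper used GRG2, a generalized reduced gradient code).
res = minimize(loss, w0, jac=grad, method="BFGS")
print("BFGS loss:", res.fun)
```

Flattening all weights and biases into a single vector is what lets a general-purpose optimizer treat training as ordinary unconstrained minimization, which mirrors the framing the paper applies to GRG2.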