RECURRENT NEURAL NETWORKS ARE UNIVERSAL APPROXIMATORS.

Author: SCHÄFER, ANTON MAXIMILIAN; ZIMMERMANN, HANS-GEORG
Subject:
Source: International Journal of Neural Systems; Aug2007, Vol. 17 Issue 4, p253-263, 11p, 4 Diagrams
Abstract: Recurrent Neural Networks (RNN) have been developed for a better understanding and analysis of open dynamical systems. Still, the question often arises whether RNN are able to map every open dynamical system, which would be desirable for a broad spectrum of applications. In this article we give a proof of the universal approximation ability of RNN in state space model form and even extend it to Error Correction and Normalized Recurrent Neural Networks. [ABSTRACT FROM AUTHOR]
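
For orientation, the "state space model form" referenced in the abstract typically denotes a recurrence of the following shape (a minimal sketch; the exact notation, activation, and bias convention assumed here may differ from the paper):

\[
s_{t+1} = f\bigl(A s_t + B x_t - \theta\bigr), \qquad y_t = C s_t,
\]

where \(x_t\) is the external input, \(s_t\) the hidden (inner) state, \(y_t\) the output, \(A\), \(B\), \(C\) are weight matrices, \(\theta\) is a bias vector, and \(f\) is a sigmoid-type activation applied componentwise. In this reading, the universal approximation claim says that an open dynamical system \(s_{t+1} = g(s_t, x_t)\), \(y_t = h(s_t)\) can be approximated arbitrarily well by such an RNN on compact sets over a finite time horizon.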
Database: Complementary Index