Popis: |
Modeling, characterization, and prediction of chaotic systems have been areas of vigorous pursuit for many years. Linear methods, such as Fourier decomposition, do not distinguish chaotic dynamical behavior from noise. Of fundamental importance is the characterization of the physical system or the underlying equations from a given time series or phase-space attractor, as well as the influence of noise in coupling across basin boundaries and in modifying the otherwise purely deterministic dynamics. Local approximation methods for arbitrary chaotic attractors are, in general, insufficient to deduce the generating equations and conditions [1]. A feed-forward, hidden-layer neural network (FFNN), on the other hand, is manifestly a function generator, and when trained adequately on a chaotic time series is shown to constitute a global approximation to the attractor [2]. That is, an FFNN trained on a chaotic time series becomes a functional realization of that time series in the global sense. Furthermore, an FFNN is shown to coarse-grain the noise in a time series during training, in the least-squares sense [3]. The functional-realization property of the FFNN makes data-window extension possible once the network is trained on a stationary time series, which can in fact be a rather narrow window. This is accomplished by choosing the input, in the form of delay coordinates, from a portion of the original time series that was not part of the training set, and then feeding the output back into the input; the trained FFNN thus becomes self-generating, which facilitates data-window extension.
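As a concrete illustration (not taken from the original work), the self-generating scheme described above can be sketched in plain Python/NumPy: a small one-hidden-layer feed-forward network is trained on a logistic-map time series presented in delay coordinates, and is then run closed-loop by feeding each output back into the input. The map, network size, learning rate, and window lengths are all illustrative assumptions, not the paper's actual setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative chaotic series: the logistic map x_{n+1} = 4 x (1 - x).
def logistic_series(n, x0=0.3):
    xs = np.empty(n)
    x = x0
    for i in range(n):
        xs[i] = x
        x = 4.0 * x * (1.0 - x)
    return xs

series = logistic_series(600)

# Delay-coordinate embedding: input (x_{t}, x_{t+1}), target x_{t+2}.
d = 2
X = np.stack([series[i:i + d] for i in range(len(series) - d)])
y = series[d:]

# Train only on a narrow window; the rest of the series stays unseen.
Xtr, ytr = X[:400], y[:400]

# One hidden tanh layer, linear output, full-batch gradient descent.
h = 16
W1 = rng.normal(0, 0.5, (d, h)); b1 = np.zeros(h)
W2 = rng.normal(0, 0.5, (h, 1)); b2 = np.zeros(1)
lr = 0.05

def forward(Xin):
    H = np.tanh(Xin @ W1 + b1)
    return H, (H @ W2 + b2).ravel()

losses = []
for epoch in range(3000):
    H, pred = forward(Xtr)
    err = pred - ytr
    losses.append(float(np.mean(err ** 2)))
    # Backpropagate the mean-squared error.
    g_out = 2.0 * err[:, None] / len(ytr)
    gW2 = H.T @ g_out;  gb2 = g_out.sum(0)
    g_h = (g_out @ W2.T) * (1.0 - H ** 2)
    gW1 = Xtr.T @ g_h;  gb1 = g_h.sum(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

# Closed-loop generation: seed with delay coordinates from a portion of
# the series NOT in the training set, then feed outputs back as inputs,
# so the trained network becomes self-generating.
window = list(series[500:500 + d])
generated = []
for _ in range(50):
    _, out = forward(np.array(window)[None, :])
    nxt = float(out[0])
    generated.append(nxt)
    window = window[1:] + [nxt]   # slide the delay window forward
```

The key point of the sketch is the last loop: once trained, the network needs no further data, since each new output re-enters as the most recent delay coordinate, extending the data window indefinitely.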