Neural Grammar Networks.

Author: Ma, Eddie Y. T., Kremer, Stefan C.
Source: Innovations in Neural Information Paradigms & Applications; 2009, p67-96, 30p
Abstract: Artificial Neural Networks (ANNs) (Haykin, 1998) are universal function approximators (Hornik et al., 1989) with adaptive behaviour based on gradient descent in error space (Werbos, 1994) and other methods. These networks have been applied to a wide variety of problems from a broad range of domains that can be formulated as vector-to-vector mappings. That is, the function to be approximated must be represented as a function whose domain and range are given by two vector spaces (Figure 1). Many methods have been used to translate data into such spaces. A variety of interesting problems, including those that process measurements from a fixed set of sensors, naturally lend themselves to vector representations. When data is not easily encoded in fixed-size vectors, a number of transformations have been proposed, including padding the data, frequency-space representations, windowing, and others. [ABSTRACT FROM AUTHOR]
Database: Complementary Index
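The abstract mentions padding and windowing as ways to fit variable-length data into the fixed-size vectors an ANN expects. A minimal sketch of these two transformations, assuming a pad value of 0.0 and a window stride of 1 (the function names `pad_to_length` and `windows` are illustrative, not from the chapter):

```python
def pad_to_length(seq, length, pad_value=0.0):
    """Right-pad (or truncate) a sequence to a fixed length."""
    return list(seq[:length]) + [pad_value] * max(0, length - len(seq))

def windows(seq, size, stride=1):
    """Slice a sequence into overlapping fixed-size windows."""
    return [list(seq[i:i + size]) for i in range(0, len(seq) - size + 1, stride)]

# A variable-length signal becomes one fixed-size vector (padding)
# or several fixed-size vectors (windowing).
signal = [0.5, 1.0, -0.25, 0.75]
padded = pad_to_length(signal, 6)  # -> [0.5, 1.0, -0.25, 0.75, 0.0, 0.0]
framed = windows(signal, 3)        # -> [[0.5, 1.0, -0.25], [1.0, -0.25, 0.75]]
```

Either transformation yields inputs of a single fixed dimensionality, which is what lets a standard feed-forward network treat them as points in one vector space.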