Description: |
For a single hidden layer feedforward artificial neural network to possess the universal approximation property, it is sufficient that the activation functions of the hidden layer nodes be continuous and non-polynomial; the activation function is not required to be sigmoidal. In this paper a simple continuous, bounded, non-constant, differentiable, non-sigmoid and non-polynomial function is proposed for use as the activation function at the hidden layer nodes. The proposed activation function does not require the computation of an exponential function, and is thus computationally less intensive than either the log-sigmoid or the hyperbolic tangent function. On a set of 10 function approximation tasks we demonstrate the efficiency and efficacy of the proposed activation function. The results allow us to assert that, at least on these 10 function approximation tasks, networks using the proposed activation function reach deeper minima of the error functional in equal epochs of training, generalize better in most of the cases, and are statistically as good as, if not better than, networks using the logistic function as the activation function at the hidden nodes.
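As an illustration only, the following is a minimal Python sketch of one function having all of the properties the abstract lists (continuous, bounded, non-constant, differentiable, non-sigmoid, non-polynomial, exponential-free); the function f(x) = x / (1 + x^2) and the names used here are assumptions for exposition, not the function actually proposed in the paper.

    import numpy as np

    def example_nonsigmoid_activation(x):
        # Hypothetical example, not the paper's function: x / (1 + x^2) is
        # continuous, bounded (|f(x)| <= 1/2), non-constant, differentiable,
        # non-monotonic (hence non-sigmoid), non-polynomial, and requires no
        # exponential evaluation.
        return x / (1.0 + x ** 2)

    def example_nonsigmoid_activation_grad(x):
        # Derivative (1 - x^2) / (1 + x^2)^2, as needed for backpropagation.
        return (1.0 - x ** 2) / (1.0 + x ** 2) ** 2

    def log_sigmoid(x):
        # Reference point: the logistic function costs one exponential per
        # evaluation, which the example above avoids.
        return 1.0 / (1.0 + np.exp(-x))

The non-monotonicity of the example is what makes it non-sigmoidal, while its rational form is what keeps each evaluation free of exponentials; any concrete cost comparison against log-sigmoid or tanh would, of course, depend on the paper's actual function.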