Symmetric Power Activation Functions for Deep Neural Networks

Author: Yassine Berradi
Year of publication: 2018
Source: LOPAL
DOI: 10.1145/3230905.3230956
Description: Common nonlinear activation functions with large saturation regions, such as Sigmoid and Tanh, cannot guarantee useful and efficient training of Deep Neural Networks (DNNs) because they suffer from the vanishing gradient problem. The Rectified Linear Unit (ReLU) is an activation function that mitigates this problem and speeds up learning, and for this reason it has been crucial to the recent success of DNNs. To improve on the performance of DNNs with ReLU, we propose a new activation function technique that modifies the positive part of ReLU. The technique combines two power functions that are symmetric with respect to the linear part of ReLU. During training, these functions are used independently, one at a time. For generalization, the mean of the two functions is used as the activation function of the trained DNN. Experimental results on real benchmark datasets demonstrate the effectiveness of the proposed technique, showing that it achieves good classification accuracy compared to Sigmoid, Tanh and ReLU.
Database: OpenAIRE
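
The sketch below is only a rough illustration of the idea as stated in the abstract, not the authors' implementation: the exponent value `P`, the alternation schedule during training, and the function names are assumptions. It shows two power functions on the positive axis, x**p and x**(1/p), which are mirror images of each other across ReLU's linear part y = x; one is applied at a time during training, and their mean is used for inference.

```python
import numpy as np

def power_relu(x, p):
    """ReLU-like activation whose positive part is x**p (plain ReLU when p == 1)."""
    # np.maximum keeps the base non-negative so fractional powers stay real.
    return np.where(x > 0, np.power(np.maximum(x, 0.0), p), 0.0)

# Assumed exponent: the abstract does not specify its value.
P = 1.5

def activation_train(x, use_first):
    """Training: only one of the two symmetric power functions is used at a time."""
    return power_relu(x, P) if use_first else power_relu(x, 1.0 / P)

def activation_eval(x):
    """Generalization/inference: the mean of the two functions is used."""
    return 0.5 * (power_relu(x, P) + power_relu(x, 1.0 / P))

# Example with an assumed schedule that alternates the two functions per step.
x = np.linspace(-2.0, 2.0, 5)
for step in range(2):
    y = activation_train(x, use_first=(step % 2 == 0))
print(activation_eval(x))
```

In this sketch the averaging happens only at evaluation time; how the paper schedules the switch between the two functions during training is not detailed in the abstract and is left as a per-step alternation here purely for illustration.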