Text Classification Based on ReLU Activation Function of SAE Algorithm

Author: Shuang Qiu, Jia-le Cui, Mingyang Jiang, Yi-nan Lu, Zhili Pei
Year of publication: 2017
Subject:
Source: Advances in Neural Networks-ISNN 2017 ISBN: 9783319590714
ISNN (1)
DOI: 10.1007/978-3-319-59072-1_6
Description: To address the vanishing gradient of the Sigmoid function during back-propagation when training deep self-encoding (autoencoder) neural networks, a method based on the ReLU activation function is proposed for training the self-encoding neural network. This paper analyzes the performance of different activation functions, comparing ReLU with the traditional Tanh and Sigmoid activation functions in experiments on the Reuters-21578 standard test set. The experimental results show that using ReLU as the activation function not only improves the network's convergence speed but also improves its accuracy.
Database: OpenAIRE
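The vanishing-gradient argument in the description can be sketched numerically. The following is a minimal illustration (not the paper's code, and the function names are my own): the Sigmoid derivative is bounded by 0.25 and shrinks toward zero for large inputs, so the per-layer gradient factors in back-propagation decay multiplicatively with depth, while the ReLU derivative is exactly 1 for active units.

```python
import numpy as np

# Illustrative sketch: why Sigmoid gradients vanish in deep networks
# while ReLU gradients pass through active units unattenuated.

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_grad(x):
    s = sigmoid(x)
    return s * (1.0 - s)  # peaks at 0.25 (at x = 0), near 0 for large |x|

def relu_grad(x):
    return (x > 0).astype(float)  # exactly 1 for positive inputs, else 0

x = np.array([-4.0, -1.0, 0.5, 3.0])
print(sigmoid_grad(x))  # all values <= 0.25
print(relu_grad(x))     # only 0s and 1s

# Back-propagation multiplies one such derivative per layer, so with
# Sigmoid the gradient magnitude is bounded above by 0.25 ** depth.
depth = 10
print(0.25 ** depth)  # upper bound on the Sigmoid gradient product
```

This is the mechanism the paper exploits: replacing Sigmoid with ReLU in the stacked autoencoder removes the 0.25-per-layer attenuation, which is consistent with the reported faster convergence.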