Text Classification Based on ReLU Activation Function of SAE Algorithm
Author: | Shuang Qiu, Jia-le Cui, Mingyang Jiang, Yi-nan Lu, Zhili Pei |
Year of publication: | 2017 |
Subject: | Artificial neural network; Computer science; Activation function; Hyperbolic function; Pattern recognition; Sigmoid function; Test set; Artificial intelligence; Algorithm |
Source: | Advances in Neural Networks - ISNN 2017, ISBN: 9783319590714, ISNN (1) |
DOI: | 10.1007/978-3-319-59072-1_6 |
Description: | To address the vanishing gradient problem that the Sigmoid function suffers during back-propagation when training deep self-encoding (autoencoder) neural networks, a training method based on the ReLU activation function is proposed. This paper analyzes the performance of different activation functions, comparing ReLU with the traditional Tanh and Sigmoid functions in experiments on the Reuters-21578 standard test set. The experimental results show that using ReLU as the activation function improves both the network's convergence speed and its classification accuracy. |
Database: | OpenAIRE |
External link: |
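The abstract's core claim is that Sigmoid (and Tanh) gradients vanish when back-propagated through many layers, while ReLU's gradient does not. A minimal NumPy sketch (not the paper's code; the input value and layer depth are illustrative assumptions) shows why: the derivative of Sigmoid is at most 0.25 and that of Tanh is at most 1 only near zero, so multiplying these derivatives across many layers drives the gradient toward zero, whereas ReLU contributes a derivative of exactly 1 for positive activations.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def d_sigmoid(x):
    # Derivative s * (1 - s); its maximum is 0.25 at x = 0.
    s = sigmoid(x)
    return s * (1.0 - s)

def d_tanh(x):
    # Derivative 1 - tanh(x)^2; close to 0 for |x| >> 0.
    return 1.0 - np.tanh(x) ** 2

def d_relu(x):
    # Derivative is 1 for positive inputs, 0 otherwise.
    return (x > 0).astype(float)

# Illustrative setting: a pre-activation value of 2.0 repeated
# across 20 layers (both numbers are assumptions for demonstration).
x = 2.0
depth = 20

# Back-propagation multiplies one derivative per layer; the product
# approximates how much of the gradient survives the full depth.
for name, grad in [("sigmoid", d_sigmoid), ("tanh", d_tanh), ("relu", d_relu)]:
    surviving = grad(np.array([x]))[0] ** depth
    print(f"{name:7s}: surviving gradient factor = {surviving:.3e}")
```

Running this shows the Sigmoid and Tanh products collapsing to values on the order of 1e-20 or smaller, while the ReLU product stays at 1.0, which matches the paper's motivation for replacing Sigmoid with ReLU in the stacked autoencoder.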