Author: Pravin Chandra, Udayan Ghose, Apoorvi Sood
Year of publication: 2021
Subject:
Source: Lecture Notes in Networks and Systems, ISBN 9789811597114
DOI: 10.1007/978-981-15-9712-1_24
Description: Activation functions play a major role in determining the learning speed and generalization capability of feed-forward artificial neural networks. In this paper, an empirical comparison of eight activation functions is reported on 12 function approximation problems. The study shows that the sigmoidal class of activation functions performed much better than the non-sigmoidal class. Of the six non-sigmoidal activation functions, one, the sigmoidal-weighted linear unit, is identified as outperforming all other non-sigmoidal activation functions.
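For reference, a minimal sketch of the sigmoid-weighted linear unit (SiLU) named in the abstract, assuming the commonly used definition silu(x) = x * sigmoid(x); the exact formulation and parameters used in the paper are not quoted here.

import numpy as np

def sigmoid(x):
    # Logistic sigmoid: 1 / (1 + exp(-x))
    return 1.0 / (1.0 + np.exp(-x))

def silu(x):
    # Sigmoid-weighted linear unit: the input scaled by its own sigmoid.
    # Approaches 0 for large negative x and approaches x for large positive x.
    return x * sigmoid(x)

if __name__ == "__main__":
    x = np.linspace(-5.0, 5.0, 5)
    print(silu(x))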
Database: OpenAIRE
External link: