Why Rectified Power Unit Networks Fail and How to Improve It: An Effective Theory Perspective

Author: Kim, Taeyoung; Kang, Myungjoo
Publication year: 2024
Subject:
Document type: Working Paper
Description: The Rectified Power Unit (RePU) activation function, unlike the Rectified Linear Unit (ReLU), has the advantage of being differentiable when used to construct neural networks. However, it can be observed experimentally that when deep layers are stacked, neural networks built with RePU encounter critical issues: activation values explode or vanish and training fails, regardless of the hyperparameter initialization. From the perspective of effective theory, we aim to identify the causes of this phenomenon and propose a new activation function that retains the advantages of RePU while overcoming its drawbacks.
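
A minimal sketch in NumPy of the failure mode the description mentions; the width, depth, power p = 2, and He-style initialization below are illustrative assumptions, not the paper's setup:

    import numpy as np

    # RePU activation: RePU_p(x) = max(0, x)**p. For integer p >= 2 it is
    # (p - 1)-times continuously differentiable at 0, unlike ReLU (p = 1).
    def repu(x, p=2):
        return np.maximum(0.0, x) ** p

    # Illustrative deep stack of random linear layers followed by RePU.
    # The width, depth, and He-style weight scale are demo assumptions.
    rng = np.random.default_rng(0)
    width, depth = 128, 20
    x = rng.standard_normal(width)
    for layer in range(1, depth + 1):
        W = rng.standard_normal((width, width)) * np.sqrt(2.0 / width)
        x = repu(W @ x, p=2)
        if layer % 4 == 0:
            # The typical activation scale compounds per layer because
            # squaring roughly squares the preactivation scale each step.
            print(f"layer {layer:2d}: mean |activation| = {np.abs(x).mean():.3e}")

Running this typically shows the mean activation magnitude overflowing to inf (or collapsing to 0) within a dozen layers, which is the exploding/vanishing behavior the abstract attributes to deep RePU networks.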
Comment: 25 pages, 8 figures
Database: arXiv