SinLU: Sinu-Sigmoidal Linear Unit

Authors: Jin Hee Yoon, Ram Sarkar, Zong Woo Geem, Ashis Paul, Rajarshi Bandyopadhyay
Language: English
Year of publication: 2022
Source: Mathematics, Vol. 10, Iss. 3, p. 337 (2022)
ISSN: 2227-7390
Description: Non-linear activation functions are integral to deep neural architectures. Given the large and complex datasets that neural networks are trained on, their computational complexity and approximation capability can differ significantly depending on the activation function used. Parameterizing an activation function by introducing learnable parameters generally improves performance. Herein, a novel activation function called Sinu-sigmoidal Linear Unit (or SinLU) is proposed. SinLU is formulated as SinLU(x) = (x + a sin(bx)) · σ(x), where σ(x) is the sigmoid function. The proposed function incorporates a sine wave, enabling functionality beyond traditional linear unit activations. Two trainable parameters of this function control the contribution of the sinusoidal component, and help to achieve a function that is easy to train and fast to converge. The performance of the proposed SinLU is compared against widely used activation functions, such as ReLU, GELU, and SiLU. We demonstrate the robustness of the proposed activation function through experiments in a wide array of domains, using multiple types of neural network-based models on standard datasets. The use of a sine wave with trainable parameters results in SinLU performing better than commonly used activation functions.
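
A minimal PyTorch sketch of the definition above, SinLU(x) = (x + a sin(bx)) · σ(x). The abstract does not specify how a and b are shaped or initialized; here they are assumed to be scalar per-module parameters initialized to 1.

import torch
import torch.nn as nn

class SinLU(nn.Module):
    """Sinu-sigmoidal Linear Unit: (x + a*sin(b*x)) * sigmoid(x),
    with a and b as trainable parameters (scalar and init=1.0 are assumptions)."""
    def __init__(self, a: float = 1.0, b: float = 1.0):
        super().__init__()
        # a scales the amplitude of the sine term, b its frequency; both are learned.
        self.a = nn.Parameter(torch.tensor(a))
        self.b = nn.Parameter(torch.tensor(b))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return (x + self.a * torch.sin(self.b * x)) * torch.sigmoid(x)

# Usage: drop-in replacement for ReLU/GELU/SiLU in a small network.
net = nn.Sequential(nn.Linear(16, 32), SinLU(), nn.Linear(32, 1))
out = net(torch.randn(4, 16))
print(out.shape)  # torch.Size([4, 1])
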
Database: OpenAIRE