Author: |
Pardede, Hilman F., Adhi, Purwoko, Zilvan, Vicky, Yuliani, Asri R., Arisal, Andria |
Subject: |
|
Source: |
Neural Processing Letters; Aug2023, Vol. 55 Issue 4, p5193-5214, 22p |
Abstract: |
In this paper, we present a generalization of the sigmoid loss function by applying the q-exponential (q-exp) of Tsallis statistics. Within this framework, we can relax or tighten the slopes of the sigmoid loss depending on the q value of the q-exp. We call the result q-sigmoid. Our derivation of q-sigmoid shows that the proposed loss function offers a way to explain learning-rate factors, which in traditional gradient-descent optimization are set heuristically, with values selected empirically. Here, we relate q to the Lipschitz constant and derive an adaptive way to determine it. We implement the proposed loss function in logistic regression on five datasets: MNIST, Cifar-10, Cifar-100, and two datasets for plant disease detection. The experiments show that for some values of q, our method outperforms logistic regression with the sigmoid loss and a fixed learning rate. [ABSTRACT FROM AUTHOR] |
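The abstract does not spell out the exact functional form used in the paper; a minimal sketch, assuming the standard Tsallis q-exponential and the straightforward substitution of q-exp for exp inside the logistic sigmoid (function names and this exact q-sigmoid form are assumptions, not taken from the paper):

```python
import numpy as np

def q_exp(x, q):
    """Tsallis q-exponential: [1 + (1 - q) x]_+^(1 / (1 - q)).

    Reduces to the ordinary exponential in the limit q -> 1.
    """
    x = np.asarray(x, dtype=float)
    if abs(q - 1.0) < 1e-12:
        return np.exp(x)
    # Clip the base at zero (the "cut-off" convention of Tsallis statistics).
    base = np.maximum(1.0 + (1.0 - q) * x, 0.0)
    with np.errstate(divide="ignore"):
        return np.where(base > 0.0, base ** (1.0 / (1.0 - q)), 0.0)

def q_sigmoid(x, q):
    """One plausible q-sigmoid: replace exp with q_exp in 1 / (1 + exp(-x)).

    Varying q flattens or steepens the slope relative to the usual sigmoid.
    """
    return 1.0 / (1.0 + q_exp(-x, q))
```

For q = 1 this recovers the ordinary sigmoid exactly; moving q away from 1 changes the slope of the transition region, which is the relaxing/tightening behavior the abstract describes.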
Database: |
Complementary Index |
External link: |
|