$\alpha$-Divergence Loss Function for Neural Density Ratio Estimation

Author: Kitazawa, Yoshiaki
Year of publication: 2024
Subject:
Document type: Working Paper
Description: Density ratio estimation (DRE) is a fundamental machine learning technique for capturing relationships between two probability distributions. State-of-the-art DRE methods estimate the density ratio using neural networks trained with loss functions derived from variational representations of $f$-divergence. However, existing methods face optimization challenges such as overfitting due to loss functions that are unbounded below, biased mini-batch gradients, vanishing training-loss gradients, and high sample requirements for Kullback-Leibler (KL) divergence loss functions. To address these issues, we focus on $\alpha$-divergence, which provides a suitable variational representation of $f$-divergence, and derive from it a novel loss function for DRE, the $\alpha$-divergence loss function ($\alpha$-Div). $\alpha$-Div is concise yet provides stable and effective optimization for DRE. The boundedness of $\alpha$-divergence offers the potential for successful DRE on data exhibiting high KL divergence. Our numerical experiments demonstrate the stability and effectiveness of optimization with $\alpha$-Div. However, the experiments also show that the proposed loss function offers no significant advantage over the KL-divergence loss function in terms of RMSE for DRE. This indicates that the accuracy of DRE is determined primarily by the amount of KL divergence in the data and depends less on $\alpha$-divergence.
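For context, the sketch below records the standard variational (Nguyen-Wainwright-Jordan) lower bound of an $f$-divergence from which neural DRE losses of this kind are derived, together with one common (Cressie-Read) parametrization of the $\alpha$-divergence generator. The paper's exact $\alpha$-Div loss is not reproduced here, and the generator convention shown is an assumption, not necessarily the one used by the author.

% Variational (NWJ) representation of an f-divergence with convex generator f
% and Fenchel conjugate f^*; neural DRE losses maximize the bracketed
% objective over a network T, whose maximizer encodes the density ratio p/q.
\[
  D_f(P \,\|\, Q)
  = \sup_{T}\Bigl\{ \mathbb{E}_{x \sim P}\bigl[T(x)\bigr]
    - \mathbb{E}_{x \sim Q}\bigl[f^{*}\bigl(T(x)\bigr)\bigr] \Bigr\},
  \qquad
  T^{*}(x) = f'\!\left(\frac{p(x)}{q(x)}\right).
\]
% One common (assumed) parametrization of the alpha-divergence generator,
% valid for alpha outside {0, 1}:
\[
  f_{\alpha}(u) = \frac{u^{\alpha} - \alpha u + \alpha - 1}{\alpha(\alpha - 1)}.
\]

Under this convention, $D_{\alpha}$ is bounded above by $1/(\alpha(1-\alpha))$ for $0 < \alpha < 1$, which is the boundedness property the description credits with enabling DRE on data exhibiting high KL divergence.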
Comment: In Theorem 7.1 (Theorem B.15), $\mathcal{T}_{\text{Lip}}$ was changed to the set of all locally Lipschitz continuous functions. In the previous version, $\mathcal{T}_{\text{Lip}}$ was defined as the set of all Lipschitz continuous functions, which is unsuitable for the statement of case (ii) of the theorem.
Database: arXiv