A Theoretical Analysis of the Learning Dynamics under Class Imbalance

Author: Francazi, Emanuele; Baity-Jesi, Marco; Lucchi, Aurelien
Publication year: 2022
Source: International Conference on Machine Learning 2023, PMLR, pp. 10285-10322
Document type: Working Paper
Description: Data imbalance is a common problem in machine learning that can have a critical effect on the performance of a model. Various solutions exist, but their impact on the convergence of the learning dynamics is not understood. Here, we elucidate the significant negative impact of data imbalance on learning, showing that the learning curves for minority and majority classes follow sub-optimal trajectories when training with a gradient-based optimizer. This slowdown is related to the imbalance ratio and can be traced back to a competition between the optimization of the different classes. Our main contribution is the analysis of the convergence of full-batch gradient descent (GD) and stochastic gradient descent (SGD), and of variants that renormalize the contribution of each per-class gradient. We find that GD is not guaranteed to decrease the loss for each class, but that this problem can be addressed by performing a per-class normalization of the gradient. With SGD, class imbalance has an additional effect on the direction of the gradients: the minority class suffers from higher directional noise, which reduces the effectiveness of per-class gradient normalization. Our findings not only allow us to understand the potential and limitations of strategies based on per-class gradients, but also explain the effectiveness of previously used solutions for class imbalance, such as oversampling.
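As an illustration of the per-class gradient normalization described in the abstract, here is a minimal sketch in plain NumPy. The toy data, logistic-regression model, and step size are hypothetical choices for illustration, not the paper's experimental setup; the sketch only shows the core idea that each class's mean gradient is rescaled to unit norm before the update, so the majority class cannot dominate the descent direction.

```python
import numpy as np

rng = np.random.default_rng(0)

# Imbalanced two-class toy data: 900 majority vs. 100 minority samples.
n_maj, n_min, d = 900, 100, 5
X = np.vstack([rng.normal(-1.0, 1.0, (n_maj, d)),
               rng.normal(+1.0, 1.0, (n_min, d))])
y = np.concatenate([np.zeros(n_maj), np.ones(n_min)])

def per_class_normalized_gd_step(w, X, y, lr=0.1, eps=1e-12):
    """One full-batch GD step on logistic regression where each
    per-class gradient is normalized to unit norm before summing."""
    p = 1.0 / (1.0 + np.exp(-X @ w))  # sigmoid predictions
    update = np.zeros_like(w)
    for c in (0, 1):
        mask = (y == c)
        # Mean cross-entropy gradient over the samples of class c.
        g_c = X[mask].T @ (p[mask] - y[mask]) / mask.sum()
        # Renormalize so both classes contribute equally to the update.
        update += g_c / (np.linalg.norm(g_c) + eps)
    return w - lr * update

w = np.zeros(d)
for _ in range(200):
    w = per_class_normalized_gd_step(w, X, y)
```

Without the normalization step, the summed gradient is dominated by the 900 majority-class samples; with it, both classes steer the update with equal weight, which is the mechanism the paper analyzes for full-batch GD.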
Comment: In the latest update of our paper, we've refined the formulations of the theorems and their proofs in the appendix to improve clarity.
Database: arXiv