Refined Statistical Bounds for Classification Error Mismatches with Constrained Bayes Error

Authors: Yang, Zijian, Eminyan, Vahe, Schlüter, Ralf, Ney, Hermann
Publication year: 2024
Subject:
Document type: Working Paper
Description: In statistical classification/multiple hypothesis testing and machine learning, a model distribution estimated from the training data usually replaces the unknown true distribution in the Bayes decision rule, which introduces a mismatch between the Bayes error and the model-based classification error. In this work, we derive classification error bounds to study the relationship between the Kullback-Leibler divergence and this classification error mismatch. We first revisit the statistical bounds on the classification error mismatch derived in previous works, employing a different method of derivation. Then, motivated by the observation that the Bayes error is typically low in machine learning tasks such as speech recognition and pattern recognition, we derive a refined Kullback-Leibler-divergence-based bound on the error mismatch under the constraint that the Bayes error is below a given threshold.
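The mismatch described above can be made concrete on a toy problem. The sketch below (not the paper's derivation; all distributions are invented for illustration) compares the Bayes error under a true posterior with the error incurred when decisions use a mismatched model posterior, alongside the expected Kullback-Leibler divergence between the two posteriors:

```python
import numpy as np

# Hypothetical true joint distribution p(x, c) over 3 observations x
# and 2 classes c (rows sum over classes; the whole table sums to 1).
p_joint = np.array([
    [0.30, 0.05],   # x = 0
    [0.10, 0.25],   # x = 1
    [0.12, 0.18],   # x = 2
])
p_x = p_joint.sum(axis=1)             # marginal p(x)
p_c_given_x = p_joint / p_x[:, None]  # true posterior p(c|x)

# A mismatched model posterior q(c|x), standing in for a distribution
# estimated from training data.
q_c_given_x = np.array([
    [0.70, 0.30],
    [0.45, 0.55],
    [0.55, 0.45],
])

# Bayes error: decide with the true posterior, pay 1 - max_c p(c|x).
bayes_error = np.sum(p_x * (1.0 - p_c_given_x.max(axis=1)))

# Model-based error: decide argmax_c q(c|x), evaluate under p(c|x).
model_decisions = q_c_given_x.argmax(axis=1)
model_error = np.sum(p_x * (1.0 - p_c_given_x[np.arange(3), model_decisions]))

# Expected KL divergence E_x[ D( p(.|x) || q(.|x) ) ].
kl = np.sum(p_joint * np.log(p_c_given_x / q_c_given_x))

print(f"Bayes error:       {bayes_error:.4f}")   # 0.2700
print(f"Model-based error: {model_error:.4f}")   # 0.3300
print(f"Error mismatch:    {model_error - bayes_error:.4f}")
print(f"Expected KL:       {kl:.4f}")
```

On this example the model disagrees with the Bayes decision only at x = 2, producing a nonnegative error mismatch; KL-divergence-based bounds of the kind studied in the paper upper-bound such a mismatch in terms of the divergence between the true and model distributions.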
Comment: accepted at the 2024 IEEE Information Theory Workshop
Database: arXiv