Beyond Top-Class Agreement: Using Divergences to Forecast Performance under Distribution Shift
Author: | Schirmer, Mona, Zhang, Dan, Nalisnick, Eric |
---|---|
Publication Year: | 2023 |
Subject: | |
Document Type: | Working Paper |
Description: | Knowing whether a model will generalize to data 'in the wild' is crucial for safe deployment. To this end, we study notions of model disagreement that consider the full predictive distribution - specifically, disagreement based on the Hellinger distance, Jensen-Shannon divergence, and Kullback-Leibler divergence. We find that divergence-based scores provide better test error estimates and detection rates on out-of-distribution data than their top-1 counterparts. Experiments involve standard vision and foundation models. Comment: Workshop on Distribution Shifts, 37th Conference on Neural Information Processing Systems (NeurIPS 2023) |
Database: | arXiv |
External Link: |
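The abstract contrasts top-1 (argmax) disagreement with scores computed from the full predictive distributions. A minimal sketch of these standard quantities follows; the function names and the small smoothing constant are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    # Kullback-Leibler divergence KL(p || q) between two
    # categorical predictive distributions; eps avoids log(0).
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

def js_divergence(p, q):
    # Jensen-Shannon divergence: symmetrized, bounded variant of KL,
    # measured against the mixture m = (p + q) / 2.
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    m = 0.5 * (p + q)
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

def hellinger_distance(p, q):
    # Hellinger distance, bounded in [0, 1].
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return float(np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2)))

def top1_disagreement(p, q):
    # Baseline the paper compares against: 1.0 if the two models'
    # predicted (argmax) classes differ, else 0.0.
    return float(np.argmax(p) != np.argmax(q))
```

Note that two models can agree on the top class yet assign very different probability mass, in which case `top1_disagreement` returns 0 while the divergence-based scores remain positive - the distinction the abstract is about.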