Epistemic uncertainty quantification in deep learning classification by the Delta method

Authors: Morten Brun, Antonella Zanna Munthe-Kaas, Hans J. Skaug, Geir Kjetil Nilsen
Year of publication: 2021
Subject:
Source: Neural Networks
ISSN: 1879-2782
Description: The Delta method is a classical procedure for quantifying epistemic uncertainty in statistical models, but its direct application to deep neural networks is prevented by the large number of parameters P. We propose a low-cost approximation of the Delta method applicable to L2-regularized deep neural networks, based on the top K eigenpairs of the Fisher information matrix. We address efficient computation of full-rank approximate eigendecompositions in terms of the exact inverse Hessian, the inverse outer-products-of-gradients approximation, and the so-called Sandwich estimator. Moreover, we provide bounds on the approximation error for the uncertainty of the predictive class probabilities. We show that when the smallest computed eigenvalue of the Fisher information matrix is near the L2-regularization rate, the approximation error will be close to zero even when K ≪ P. A demonstration of the methodology is presented using a TensorFlow implementation, and we show that meaningful rankings of images based on predictive uncertainty can be obtained for two LeNet- and ResNet-based neural networks using the MNIST and CIFAR-10 datasets. Further, we observe that false positives have, on average, a higher predictive epistemic uncertainty than true positives. This suggests that the uncertainty measure carries supplementary information not captured by the classification alone.
Database: OpenAIRE
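The abstract's low-rank construction can be sketched as follows: the Delta method variance g^T F^{-1} g is computed exactly on the top-K eigenspace of the Fisher information matrix F, while the discarded eigenvalues are replaced by the L2-regularization rate. This is a minimal NumPy sketch under stated assumptions; the function name, the toy diagonal Fisher matrix, and the use of a dense eigendecomposition are illustrative choices, not the paper's TensorFlow implementation.

```python
import numpy as np

def delta_variance(g, lam, V, l2_rate):
    """Approximate g^T F^{-1} g from the top-K eigenpairs (lam, V) of F.

    g       : gradient of a predictive output w.r.t. the P parameters
    lam     : top-K eigenvalues of the Fisher information matrix
    V       : corresponding eigenvectors, shape (P, K)
    l2_rate : L2-regularization rate, substituted for the P-K
              eigenvalues that were not computed
    """
    c = V.T @ g                       # coefficients of g in the top-K eigenbasis
    top = np.sum(c**2 / lam)          # exact contribution of the top-K subspace
    rest = (g @ g - c @ c) / l2_rate  # remainder, with eigenvalues ~ l2_rate
    return top + rest

# Toy example: a diagonal "Fisher matrix" whose trailing eigenvalues
# equal the regularization rate, so the K << P approximation is exact,
# illustrating the near-zero error bound described in the abstract.
P, K = 5, 2
F = np.diag([10.0, 5.0, 0.1, 0.1, 0.1])
lam_all, V_all = np.linalg.eigh(F)            # ascending order
lam, V = lam_all[::-1][:K], V_all[:, ::-1][:, :K]
g = np.ones(P)

approx = delta_variance(g, lam, V, l2_rate=0.1)
exact = g @ np.linalg.inv(F) @ g
```

In practice the top-K eigenpairs would come from an iterative solver (e.g. Lanczos) using Hessian-vector products rather than from a dense eigendecomposition, since forming F explicitly is infeasible for large P.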