Prediction error quantification through probabilistic scaling -- EXTENDED VERSION
Author: | Mirasierra, Victor; Mammarella, Martina; Dabbene, Fabrizio; Alamo, Teodoro |
Year: | 2021 |
Document type: | Working Paper |
Description: | In this paper, we address the probabilistic error quantification of a general class of prediction methods. We consider a given prediction model and show how to obtain, through a sample-based approach, a probabilistic upper bound on the absolute value of the prediction error. The proposed scheme is based on a probabilistic scaling methodology in which the number of required randomized samples is independent of the complexity of the prediction model. The methodology is extended to address the case in which the probabilistic uncertainty quantification is required to be valid for every member of a finite family of predictors. We illustrate the results of the paper by means of a numerical example. Comment: 8 pages, 2 figures |
Database: | arXiv |
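The sample-based idea in the abstract can be illustrated with a minimal sketch. This is not the paper's probabilistic scaling procedure itself (which involves discarding a number of the largest samples); it is the simplest max-of-samples variant of the same order-statistics reasoning, and the functions `predict` and `draw_sample` are hypothetical placeholders for a user's model and data source. Note that the required sample count depends only on the tolerance `eps` and confidence `delta`, not on the complexity of the predictor, which mirrors the property claimed in the abstract.

```python
import math
import random

def sample_size(eps: float, delta: float) -> int:
    """Smallest N with (1 - eps)**N <= delta: the maximum of N i.i.d.
    error samples then upper-bounds a fresh prediction error with
    probability at least 1 - eps, at confidence level 1 - delta."""
    return math.ceil(math.log(delta) / math.log(1.0 - eps))

def probabilistic_error_bound(predict, draw_sample, eps, delta, rng):
    """Sample-based probabilistic upper bound on |y - predict(x)|.

    The number of samples depends only on (eps, delta), never on the
    structure or complexity of `predict`."""
    n = sample_size(eps, delta)
    errors = [abs(y - predict(x))
              for x, y in (draw_sample(rng) for _ in range(n))]
    return max(errors)

# Toy usage (illustrative names): a fixed linear predictor evaluated
# against noisy data from the same linear model.
rng = random.Random(0)
predict = lambda x: 2.0 * x

def draw_sample(rng):
    x = rng.uniform(-1.0, 1.0)
    y = 2.0 * x + rng.gauss(0.0, 0.1)  # true relation plus Gaussian noise
    return x, y

bound = probabilistic_error_bound(predict, draw_sample,
                                  eps=0.05, delta=1e-6, rng=rng)
```

With `eps = 0.05` and `delta = 0.01`, for instance, `sample_size` returns 90: the familiar scenario-style count, obtained without ever inspecting the predictor.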