Efficient Input Uncertainty Quantification for Ratio Estimator

Author: He, Linyun; Feng, Ben; Song, Eunhye
Publication Year: 2024
Document Type: Working Paper
Description: We study the construction of a confidence interval (CI) for a simulation output performance measure that accounts for input uncertainty when the input models are estimated from finite data. In particular, we focus on performance measures that can be expressed as a ratio of two dependent simulation outputs' means. We adopt the parametric bootstrap method to mimic input data sampling and construct the percentile bootstrap CI after estimating the ratio for each bootstrap sample. The standard estimator, which takes the ratio of two sample averages, tends to exhibit large finite-sample bias and variance, leading to overcoverage of the percentile bootstrap CI. To address this, we propose two new ratio estimators that replace the sample averages with pooled mean estimators via $k$-nearest neighbor ($k$NN) regression: the $k$NN estimator and the $k$LR estimator. The $k$NN estimator performs well in low dimensions, but its theoretical performance guarantee degrades as the dimension increases. The $k$LR estimator combines the likelihood ratio (LR) method with $k$NN regression, leveraging the strengths of both while mitigating their weaknesses: the LR method removes the dependence on dimension, while $k$NN regression controls the variance inflation introduced by the LR. Based on asymptotic analyses and finite-sample heuristics, we propose an experiment design that maximizes the efficiency of the proposed estimators and demonstrate their empirical performance on three examples, including one from an enterprise risk management application.
Database: arXiv
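
To make the workflow in the description concrete, the following Python sketch illustrates a percentile parametric bootstrap CI for a ratio of two dependent output means, first with the standard ratio of sample averages and then with a pooled-mean variant smoothed by $k$NN regression. The exponential input model, the toy `simulate` function, and all parameter values (data size, number of bootstrap samples `B`, replications `n_rep`, `n_neighbors`) are illustrative assumptions only; this is not the paper's experiment design, $k$NN estimator, or $k$LR estimator.

```python
# Illustrative sketch only: percentile parametric bootstrap CI for a ratio of
# two dependent simulation output means, under an assumed exponential input model.
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(0)

def simulate(theta, n_rep, rng):
    """Toy simulation: paired outputs (Y, Z) whose means define the ratio."""
    X = rng.exponential(scale=theta, size=n_rep)   # simulated inputs
    Y = X + rng.normal(0.0, 0.1, size=n_rep)       # output 1
    Z = X**2 + rng.normal(0.0, 0.1, size=n_rep)    # output 2, dependent on output 1 via X
    return Y, Z

# Step 1: estimate the (assumed exponential) input model from finite real-world data.
real_data = rng.exponential(scale=2.0, size=50)
theta_hat = real_data.mean()                       # MLE of the exponential mean

# Step 2: parametric bootstrap to mimic input-data sampling; standard ratio estimator.
B, n_rep = 200, 100
thetas, Ys, Zs, std_ratios = [], [], [], []
for _ in range(B):
    resample = rng.exponential(scale=theta_hat, size=real_data.size)
    theta_b = resample.mean()                      # bootstrap input parameter
    Y, Z = simulate(theta_b, n_rep, rng)
    std_ratios.append(Y.mean() / Z.mean())         # ratio of sample averages
    thetas.append(np.full(n_rep, theta_b))
    Ys.append(Y)
    Zs.append(Z)

print("standard percentile bootstrap CI:", np.percentile(std_ratios, [2.5, 97.5]))

# Step 3 (pooled-mean variant): pool all outputs across bootstrap parameters and
# smooth each mean with kNN regression in theta, so every bootstrap parameter
# borrows simulation output from nearby parameters.
theta_col = np.concatenate(thetas).reshape(-1, 1)
knn_Y = KNeighborsRegressor(n_neighbors=50).fit(theta_col, np.concatenate(Ys))
knn_Z = KNeighborsRegressor(n_neighbors=50).fit(theta_col, np.concatenate(Zs))
grid = np.unique(theta_col).reshape(-1, 1)         # one point per bootstrap parameter
knn_ratios = knn_Y.predict(grid) / knn_Z.predict(grid)
print("kNN pooled-mean percentile CI:", np.percentile(knn_ratios, [2.5, 97.5]))
```

In this sketch the pooled $k$NN ratios are typically less noisy than the per-sample ratios of averages, which mirrors the motivation for the pooled estimators described above; the choice of `n_neighbors` is an arbitrary placeholder rather than a tuned value.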