Showing 1 - 5 of 5 for search: '"Fishkov, Alexander"'
We develop a new method for generating prediction sets that combines the flexibility of conformal methods with an estimate of the conditional distribution $P_{Y \mid X}$. Existing methods, such as conformalized quantile regression and probabilistic c…
External link:
http://arxiv.org/abs/2407.01794
Author:
Plassier, Vincent, Kotelevskii, Nikita, Rubashevskii, Aleksandr, Noskov, Fedor, Velikanov, Maksim, Fishkov, Alexander, Horvath, Samuel, Takac, Martin, Moulines, Eric, Panov, Maxim
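The entry above concerns conformal prediction sets. As background for readers unfamiliar with the idea, here is a minimal split-conformal sketch on toy data; this is an illustrative assumption, not the method from the paper, which additionally uses an estimate of the conditional distribution $P_{Y \mid X}$:

```python
import numpy as np

# Illustrative split-conformal prediction sketch (NOT the paper's method).
rng = np.random.default_rng(0)

# Toy data: y = 2x + Gaussian noise
x = rng.uniform(0, 1, 200)
y = 2 * x + rng.normal(0, 0.1, 200)

# Split into a proper training set and a calibration set
x_tr, y_tr = x[:100], y[:100]
x_cal, y_cal = x[100:], y[100:]

# Fit any point predictor on the training split (here: least-squares line)
slope, intercept = np.polyfit(x_tr, y_tr, 1)
pred = lambda t: slope * t + intercept

# Nonconformity scores on the calibration split: absolute residuals
scores = np.abs(y_cal - pred(x_cal))

# Finite-sample-corrected quantile for 90% marginal coverage
alpha = 0.1
n = len(scores)
q = np.quantile(scores, min(1.0, np.ceil((n + 1) * (1 - alpha)) / n))

# Prediction interval for a new point
x_new = 0.5
lo, hi = pred(x_new) - q, pred(x_new) + q
print(f"90% prediction interval at x={x_new}: [{lo:.3f}, {hi:.3f}]")
```

Split conformal gives finite-sample marginal coverage for any underlying predictor; methods like the one summarized above aim to sharpen this toward conditional validity.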
Conformal Prediction (CP) stands out as a robust framework for uncertainty quantification, which is crucial for ensuring the reliability of predictions. However, common CP methods heavily rely on data exchangeability, a condition often violated in pr…
External link:
http://arxiv.org/abs/2312.15799
Prediction with the possibility of abstention (or selective prediction) is an important problem for error-critical machine learning applications. While well-studied in the classification setup, selective approaches to regression are much less develop…
External link:
http://arxiv.org/abs/2309.16412
Author:
Fishkov, Alexander, Panov, Maxim
Accounting for the uncertainty in the predictions of modern neural networks is a challenging and important task in many domains. Existing algorithms for uncertainty estimation require modifying the model architecture and training procedure (e.g., Bay…
External link:
http://arxiv.org/abs/2205.03194
Author:
Kotelevskii, Nikita, Artemenkov, Aleksandr, Fedyanin, Kirill, Noskov, Fedor, Fishkov, Alexander, Shelmanov, Artem, Vazhentsev, Artem, Petiushko, Aleksandr, Panov, Maxim
This paper proposes a fast and scalable method for uncertainty quantification of machine learning models' predictions. First, we show the principled way to measure the uncertainty of predictions for a classifier based on Nadaraya-Watson's nonparametr…
External link:
http://arxiv.org/abs/2202.03101
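The last entry builds on the Nadaraya-Watson nonparametric estimator. As plain background (a generic sketch, not the paper's uncertainty method), a kernel-weighted estimate of class probabilities looks like:

```python
import numpy as np

# Generic Nadaraya-Watson estimate of class probabilities at a query
# point (illustrative only; not the uncertainty method of the paper).
def nadaraya_watson_proba(x_query, x_train, y_train, bandwidth=0.2):
    # Gaussian kernel weights of each training point w.r.t. the query
    w = np.exp(-0.5 * ((x_train - x_query) / bandwidth) ** 2)
    w /= w.sum()
    # Kernel-weighted fraction of each class -> probability estimate
    classes = np.unique(y_train)
    return {c: float(w[y_train == c].sum()) for c in classes}

# Two well-separated 1-D classes
x_train = np.array([0.0, 0.1, 0.2, 0.8, 0.9, 1.0])
y_train = np.array([0, 0, 0, 1, 1, 1])
probs = nadaraya_watson_proba(0.15, x_train, y_train)
print(probs)
```

A query near the class-0 cluster receives most of its kernel mass from class-0 points, so the estimated probability concentrates there; the spread of these weighted estimates is one natural raw material for uncertainty scores.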