Showing 1 - 10 of 170 for search: '"Duchi, John C."'
We address the challenge of constructing valid confidence intervals and sets in problems of prediction across multiple environments. We investigate two types of coverage suitable for these problems, extending the jackknife and split-conformal methods…
External link:
http://arxiv.org/abs/2403.16336
Author:
Duchi, John C., Haque, Saminul
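The split-conformal construction this abstract extends can be sketched in a few lines. What follows is the standard single-environment method on synthetic data; the model, split sizes, and coverage level are illustrative choices, not the paper's multi-environment extension:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic regression data: Y = 2X + noise.
x = rng.normal(size=200)
y = 2 * x + rng.normal(scale=0.5, size=200)

# Split the data: one half fits the model, the other half calibrates.
x_fit, y_fit = x[:100], y[:100]
x_cal, y_cal = x[100:], y[100:]

slope, intercept = np.polyfit(x_fit, y_fit, deg=1)

def predict(t):
    return slope * t + intercept

# Conformity scores: absolute residuals on the held-out calibration split.
scores = np.abs(y_cal - predict(x_cal))

# Finite-sample-corrected quantile for 90% marginal coverage.
alpha = 0.1
n = len(scores)
q = np.quantile(scores, np.ceil((n + 1) * (1 - alpha)) / n, method="higher")

# Interval for a new point: prediction plus/minus the calibrated radius.
x0 = 1.0
interval = (predict(x0) - q, predict(x0) + q)
```

The interval is valid marginally over exchangeable draws; the `(n + 1)(1 - alpha)/n` correction is what makes the finite-sample guarantee exact rather than asymptotic.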
We present an information-theoretic lower bound for the problem of parameter estimation with time-uniform coverage guarantees. Via a new reduction to sequential testing, we obtain stronger lower bounds that capture the hardness of the time-uniform…
External link:
http://arxiv.org/abs/2402.08794
We present PPI++: a computationally lightweight methodology for estimation and inference based on a small labeled dataset and a typically much larger dataset of machine-learning predictions. The methods automatically adapt to the quality of available…
External link:
http://arxiv.org/abs/2311.01453
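The basic prediction-powered estimate that PPI++ refines can be sketched as follows. The data, the biased predictor `f`, and the choice `lam = 1.0` are all illustrative assumptions; part of PPI++'s contribution is choosing `lam` adaptively from the data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Goal: estimate E[Y] from a small labeled set plus a large unlabeled set.
# The predictor f is a hypothetical, systematically biased stand-in for
# "machine-learning predictions"; all numbers here are illustrative.
n_lab, n_unlab = 100, 10_000
x_lab = rng.normal(loc=5.0, size=n_lab)
y_lab = x_lab + rng.normal(scale=0.3, size=n_lab)   # E[Y] = 5.0
x_unlab = rng.normal(loc=5.0, size=n_unlab)

def f(x):
    return x - 0.4  # biased predictor: underestimates by 0.4

# Classical estimate: labels only.
classical = y_lab.mean()

# Prediction-powered estimate: predictor mean on the large unlabeled set,
# debiased by the labeled "rectifier" term. lam = 1 is basic PPI; PPI++
# instead tunes lam from the data to adapt to the predictor's quality.
lam = 1.0
ppi = f(x_unlab).mean() + lam * (y_lab - f(x_lab)).mean()
```

The rectifier cancels the predictor's bias in expectation, so the estimate stays valid even when `f` is poor, while a good `f` shrinks the variance relative to the classical estimate.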
The statistical machine learning community has demonstrated considerable resourcefulness over the years in developing highly expressive tools for estimation, prediction, and inference. The bedrock assumptions underlying these developments are that…
External link:
http://arxiv.org/abs/2202.04166
We extend the Approximate-Proximal Point (aProx) family of model-based methods for solving stochastic convex optimization problems, including stochastic subgradient, proximal point, and bundle methods, to the minibatch and accelerated setting. To do…
External link:
http://arxiv.org/abs/2101.02696
We propose and analyze algorithms for distributionally robust optimization of convex losses with conditional value at risk (CVaR) and $\chi^2$ divergence uncertainty sets. We prove that our algorithms require a number of gradient evaluations independent…
External link:
http://arxiv.org/abs/2010.05893
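For reference, the CVaR risk measure behind one of the abstract's uncertainty sets can be computed in two equivalent ways; the exponential loss distribution and the level `alpha = 0.1` here are synthetic illustrations, not the paper's setup:

```python
import numpy as np

rng = np.random.default_rng(2)

# CVaR_alpha: the expected loss over the worst alpha-fraction of outcomes.
losses = rng.exponential(scale=1.0, size=10_000)
alpha = 0.1

# Rockafellar-Uryasev form: CVaR = min_eta { eta + E[(L - eta)_+] / alpha },
# minimized at eta = VaR, the (1 - alpha)-quantile of the losses.
eta = np.quantile(losses, 1 - alpha)
cvar = eta + np.mean(np.maximum(losses - eta, 0.0)) / alpha

# Equivalent empirical view: average of the top alpha-fraction of losses.
k = int(alpha * len(losses))
cvar_topk = np.sort(losses)[-k:].mean()
```

The Rockafellar-Uryasev form is the one that matters algorithmically: it turns CVaR minimization into a jointly convex problem in the model parameters and `eta`.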
While the traditional viewpoint in machine learning and statistics assumes training and testing samples come from the same population, practice belies this fiction. One strategy -- coming from robust statistics and optimization -- is thus to build a…
External link:
http://arxiv.org/abs/2008.04267
Author:
Arjevani, Yossi, Carmon, Yair, Duchi, John C., Foster, Dylan J., Sekhari, Ayush, Sridharan, Karthik
We design an algorithm which finds an $\epsilon$-approximate stationary point (with $\|\nabla F(x)\|\le \epsilon$) using $O(\epsilon^{-3})$ stochastic gradient and Hessian-vector products, matching guarantees that were previously available only under…
External link:
http://arxiv.org/abs/2006.13476
Author:
Asi, Hilal, Duchi, John C.
We develop two notions of instance optimality in differential privacy, inspired by classical statistical theory: one by defining a local minimax risk and the other by considering unbiased mechanisms and analogizing the Cramér-Rao bound, and we show…
External link:
http://arxiv.org/abs/2005.10630
Author:
Carmon, Yair, Duchi, John C.
Published in:
SIAM Review, 62(2), 2020, pp. 395--436
We consider minimization of indefinite quadratics with either trust-region (norm) constraints or cubic regularization. Despite the nonconvexity of these problems we prove that, under mild assumptions, gradient descent converges to their global solutions…
External link:
http://arxiv.org/abs/2003.04546
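The claim can be checked on a toy cubic-regularized instance. The matrix `A`, vector `b`, regularization `rho`, step size, and iteration count below are all illustrative choices, not taken from the paper; on this instance plain gradient descent does reach the point satisfying the global optimality condition:

```python
import numpy as np

# Cubic-regularized indefinite quadratic:
#   f(x) = 0.5 x^T A x + b^T x + (rho / 3) ||x||^3,
# where A has a negative eigenvalue, so the quadratic alone is nonconvex.
A = np.diag([-1.0, 2.0])   # indefinite: eigenvalues -1 and 2
b = np.array([1.0, 1.0])
rho = 1.0

def grad(x):
    return A @ x + b + rho * np.linalg.norm(x) * x

# Plain gradient descent with a small constant step size.
x = np.zeros(2)
for _ in range(2000):
    x -= 0.05 * grad(x)

# Global optimality for this problem: (A + rho * ||x|| * I) x = -b
# together with rho * ||x|| >= -lambda_min(A) = 1.
```

The second condition is what distinguishes the global solution from other stationary points: the effective shift `rho * ||x||` must be large enough to make `A + rho * ||x|| * I` positive semidefinite.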