Showing 1 - 10 of 213 for the search '"Gotoh, Jun"'
Author:
Yagishita, Shotaro, Gotoh, Jun-ya
This paper presents a new approach to selecting knots at the same time as estimating the B-spline regression model. Such simultaneous selection of knots and model is not trivial, but our strategy can make it possible by employing a nonconvex regularizer…
External link:
http://arxiv.org/abs/2304.02306
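
To make the idea concrete: with a dense grid of candidate knots, a sparsity-inducing penalty on the spline coefficients can delete knots while fitting the model. A minimal numpy/scipy/cvxpy sketch, assuming a clamped cubic basis and a convex fused-type surrogate penalty in place of the paper's nonconvex regularizer; the data and the weight `lam` are illustrative, not from the paper:

import cvxpy as cp
import numpy as np
from scipy.interpolate import BSpline

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 1.0, 200))
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(x.size)

k = 3                                            # cubic B-splines
t = np.r_[[0.0] * (k + 1), np.linspace(0, 1, 22)[1:-1], [1.0] * (k + 1)]
n = len(t) - k - 1                               # number of basis functions

# Design matrix: column j is the j-th B-spline basis function evaluated at x.
B = np.column_stack([BSpline(t, np.eye(n)[j], k)(x) for j in range(n)])

# Convex fused-type surrogate: an l1 penalty on second differences of the
# coefficients flattens the fit locally, mimicking knot deletion.
# (Illustrative only; the paper uses a nonconvex regularizer instead.)
c = cp.Variable(n)
lam = 1.0                                        # hypothetical penalty weight
obj = 0.5 * cp.sum_squares(B @ c - y) + lam * cp.norm1(cp.diff(c, 2))
cp.Problem(cp.Minimize(obj)).solve()
print("near-zero second differences:",
      int(np.sum(np.abs(np.diff(c.value, 2)) < 1e-6)))

Second differences driven to (near) zero mark candidate knots the fit does not need; a common motivation for the nonconvex penalty is to make such selection exact while avoiding the shrinkage bias of $\ell_1$.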
Author:
Yagishita, Shotaro, Gotoh, Jun-ya
This paper studies the properties of d-stationary points of the trimmed lasso (Luo et al., 2013, Huang et al., 2015, and Gotoh et al., 2018) and the composite optimization problem with the truncated nuclear norm (Gao and Sun, 2010, and Zhang et al., …)
External link:
http://arxiv.org/abs/2209.02315
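
For reference, the trimmed lasso of the cited works penalizes only the $p - K$ smallest coefficients in magnitude; writing $|\beta_{(1)}| \ge \dots \ge |\beta_{(p)}|$ for the sorted absolute entries,

$$T_K(\beta) = \sum_{i=K+1}^{p} |\beta_{(i)}| = \|\beta\|_1 - \max_{|S| \le K} \sum_{i \in S} |\beta_i|.$$

The right-hand form exhibits the penalty as a difference of two convex functions, which is why d-stationarity (directional stationarity) is the natural solution concept for this problem class.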
While solutions of Distributionally Robust Optimization (DRO) problems can sometimes have a higher out-of-sample expected reward than the Sample Average Approximation (SAA), there is no guarantee. In this paper, we introduce a class of Distributional…
External link:
http://arxiv.org/abs/2105.12342
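
To make the SAA/DRO contrast concrete: SAA minimizes the empirical average cost, while a DRO model minimizes the worst case of that average over distributions close to the empirical one. A minimal sketch on a toy newsvendor problem, using a generic $\chi^2$-style reweighting set with radius `delta`; the ambiguity set, data, and parameters are illustrative and not the class proposed in the paper:

import cvxpy as cp
import numpy as np

rng = np.random.default_rng(1)
demand = rng.poisson(20, size=50).astype(float)   # observed demand sample
h, b = 1.0, 3.0                                   # holding / backorder unit costs

def cost(q, d):
    """Newsvendor cost of ordering q against demand d."""
    return h * np.maximum(q - d, 0.0) + b * np.maximum(d - q, 0.0)

def worst_case_mean(c, delta):
    """Worst-case average of sample costs c over chi^2-style reweightings."""
    n = c.size
    w = cp.Variable(n, nonneg=True)
    cons = [cp.sum(w) == 1,
            cp.sum_squares(n * w - 1) / (2 * n) <= delta]
    prob = cp.Problem(cp.Maximize(c @ w), cons)
    prob.solve()
    return prob.value

grid = np.arange(10, 31)                          # candidate order quantities
saa = min(grid, key=lambda q: cost(q, demand).mean())
dro = min(grid, key=lambda q: worst_case_mean(cost(q, demand), delta=0.1))
print("SAA order:", saa, " DRO order:", dro)

Larger `delta` makes the DRO order more conservative; whether such conservatism actually improves out-of-sample reward is the question the abstract raises.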
Author:
Yagishita, Shotaro, Gotoh, Jun-ya
Network Lasso (NL for short) is a methodology for estimating models by simultaneously clustering data samples and fitting the models to the samples. It often succeeds in forming clusters thanks to the geometry of the $\ell_1$-regularizer employed therein…
External link:
http://arxiv.org/abs/2012.07491
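
The NL objective has the form $\sum_i f_i(x_i) + \lambda \sum_{(i,j) \in E} \|x_i - x_j\|_2$, and the unsquared ($\ell_1$-type) edge penalty can force exact equality $x_i = x_j$ across many edges, which is what produces clusters. A minimal cvxpy sketch on a chain graph with scalar per-node parameters, where the edge penalty reduces to a 1-D fused lasso; the data and `lam` are illustrative:

import cvxpy as cp
import numpy as np

rng = np.random.default_rng(2)
# Scalar parameters on a 20-node chain graph, with two latent clusters.
truth = np.r_[np.full(10, -1.0), np.full(10, 2.0)]
y = truth + 0.3 * rng.standard_normal(truth.size)

x = cp.Variable(truth.size)
lam = 2.0                                  # hypothetical regularization weight
fit = 0.5 * cp.sum_squares(x - y)          # per-node losses f_i(x_i)
# Unsquared penalty on edge differences; in 1-D, ||x_i - x_{i+1}||_2 reduces
# to |x_i - x_{i+1}|, so the sum over chain edges is cp.norm1(cp.diff(x)).
cp.Problem(cp.Minimize(fit + lam * cp.norm1(cp.diff(x)))).solve()
print(np.round(x.value, 2))                # piecewise constant => clusters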
We introduce the notion of Worst-Case Sensitivity, defined as the worst-case rate of increase in the expected cost of a Distributionally Robust Optimization (DRO) model when the size of the uncertainty set vanishes. We show that worst-case sensitivity…
External link:
http://arxiv.org/abs/2010.10794
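
For orientation (this is the classical small-radius expansion consistent with the notion, not a statement of the paper's results): for a smooth $\phi$-divergence ball of radius $\delta$ with $\phi''(1) > 0$,

$$\max_{Q:\, D_\phi(Q \| P) \le \delta} \mathbb{E}_Q[c] = \mathbb{E}_P[c] + \sqrt{\frac{2\delta}{\phi''(1)}}\, \mathrm{Std}_P[c] + o(\sqrt{\delta}),$$

so the worst-case cost grows at a rate governed by the standard deviation of the cost as the set shrinks; Worst-Case Sensitivity compares such rates across families of uncertainty sets.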
Author:
Nakayama, Shummin, Gotoh, Jun-ya
This paper conducts a comparative study of proximal gradient methods (PGMs) and proximal DC algorithms (PDCAs) for sparse regression problems which can be cast as Difference-of-two-Convex-functions (DC) optimization problems. It has been shown that…
External link:
http://arxiv.org/abs/2007.01169
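
To illustrate the DC structure involved, take trimmed-lasso-penalized least squares, $\min_x \tfrac{1}{2}\|Ax - b\|_2^2 + \lambda(\|x\|_1 - \|x\|_{[K]})$, where $\|x\|_{[K]}$ is the sum of the $K$ largest magnitudes. A PDCA-style iteration linearizes the concave part $-\lambda\|x\|_{[K]}$ at the current point and then applies the $\ell_1$ prox (soft-thresholding). A minimal numpy sketch with a fixed step $1/L$; the data and parameters are illustrative, and this is a generic proximal DC iteration rather than the specific variants compared in the paper:

import numpy as np

rng = np.random.default_rng(3)
m, n, K, lam = 40, 100, 5, 0.5                 # illustrative sizes / weight
A = rng.standard_normal((m, n))
x_true = np.zeros(n)
x_true[:K] = 3.0
b = A @ x_true + 0.05 * rng.standard_normal(m)

L = np.linalg.norm(A, 2) ** 2                  # Lipschitz constant of the gradient

def soft(v, tau):
    """l1 prox: componentwise soft-thresholding."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

x = np.zeros(n)
for _ in range(500):
    # Subgradient of the convex part lam*||x||_[K]: signs on the K largest |x_i|.
    s = np.zeros(n)
    top = np.argsort(-np.abs(x))[:K]
    s[top] = np.sign(x[top])
    # Linearize -lam*||x||_[K], take a gradient step on the smooth part,
    # then apply the l1 prox: one proximal DC iteration.
    g = A.T @ (A @ x - b) - lam * s
    x = soft(x - g / L, lam / L)

print("nonzeros in the solution:", int(np.count_nonzero(np.abs(x) > 1e-6)))

A PGM applied to the same problem would instead prox the full nonconvex penalty directly; trade-offs of that kind are what a PGM-vs-PDCA comparison examines.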
Author:
Takano, Yuichi, Gotoh, Jun-ya
Published in:
Operations Research Perspectives 10 (2023)
We study the out-of-sample properties of robust empirical optimization problems with smooth $\phi$-divergence penalties and smooth concave objective functions, and develop a theory for data-driven calibration of the non-negative "robustness parameter"…
External link:
http://arxiv.org/abs/1711.06565
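
A standard anchor for this calibration question (stated here for a generic cost under smoothness assumptions like those above; it is the well-known small-$\delta$ behavior, not the paper's full theory): the $\phi$-divergence-penalized worst case is a mean-variance objective to first order,

$$\max_{Q} \Big\{ \mathbb{E}_Q[c] - \tfrac{1}{\delta} D_\phi(Q \| P) \Big\} = \mathbb{E}_P[c] + \frac{\delta}{2\phi''(1)}\, \mathrm{Var}_P[c] + O(\delta^2),$$

so calibrating the robustness parameter $\delta$ amounts to choosing a point on a mean-variance trade-off, which is what a data-driven calibration theory must pin down.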
We address the minimization of a smooth objective function under an $\ell_0$-constraint and simple convex constraints. When the problem has no constraints except the $\ell_0$-constraint, some efficient algorithms are available; for example, Proximal…
External link:
http://arxiv.org/abs/1701.08498
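
For context on the $\ell_0$-only case: a projected/proximal gradient method here is iterative hard thresholding, since the Euclidean projection onto $\{x : \|x\|_0 \le K\}$ just keeps the $K$ largest-magnitude entries. A minimal numpy sketch with illustrative data; once additional simple convex constraints are imposed, this cheap projection is no longer available, which is the difficulty the paper addresses:

import numpy as np

rng = np.random.default_rng(4)
m, n, K = 50, 120, 6                          # illustrative sizes
A = rng.standard_normal((m, n))
x_true = np.zeros(n)
support = rng.choice(n, K, replace=False)
x_true[support] = 3.0 * rng.standard_normal(K)
b = A @ x_true

def hard_threshold(v, K):
    """Euclidean projection onto {x : ||x||_0 <= K}: keep K largest magnitudes."""
    out = np.zeros_like(v)
    idx = np.argsort(-np.abs(v))[:K]
    out[idx] = v[idx]
    return out

L = np.linalg.norm(A, 2) ** 2                 # gradient Lipschitz constant; step 1/L
x = np.zeros(n)
for _ in range(300):
    x = hard_threshold(x - A.T @ (A @ x - b) / L, K)

print("support recovered:",
      set(np.flatnonzero(x)) == set(support))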
Academic article
This result cannot be displayed to unauthenticated users.
Signing in is required to view the result.