Showing 1 - 10 of 245 for search: '"Nobuo Yamashita"'
Author:
Kazushi Ikeda, Yoshiumi Kawamura, Kazuhisa Makino, Satoshi Tsujimoto, Nobuo Yamashita, Shintaro Yoshizawa, Hanna Sumita
This open access book presents mathematical methods for huge data and network analysis. The automotive industry has made steady progress in technological innovations under the names of Connected-Autonomous-Shared-Electric (CASE) and Mobility as a Service (MaaS)…
Author:
Shota Yamanaka, Nobuo Yamashita
Published in:
Optimization. :1-31
Recently, Yamanaka and Yamashita proposed the so-called positively homogeneous optimization problem, which includes many important problems such as absolute-value and gauge optimization. They presented a closed form of the dual formulation…
Published in:
Journal of Bioscience and Bioengineering, 2000, 90(2):226
The stochastic variational inequality problem (SVIP) is an equilibrium model that includes random variables and has been widely applied in various fields such as economics and engineering. Expected residual minimization (ERM) is an established model…
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::d058b61541ab7b194bce60b3d616c484
http://arxiv.org/abs/2111.07500
Published in:
Journal of Global Optimization. 75:63-90
In the last two decades, many descent methods for multiobjective optimization problems have been proposed. In particular, the steepest descent and Newton methods have been studied for the unconstrained case. In both methods, the search directions are computed…
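For the biobjective case, the common search direction such methods compute has a closed form: it is the negative of the minimum-norm convex combination of the two gradients. The sketch below is illustrative only; the toy objectives, step size, and function names are assumptions, not taken from the cited paper.

```python
def common_descent_direction(g1, g2):
    # Biobjective steepest-descent direction:
    # d = -(lam * g1 + (1 - lam) * g2), with lam in [0, 1] chosen to
    # minimize ||lam * g1 + (1 - lam) * g2||^2 (closed form for 2 objectives).
    den = sum((a - b) ** 2 for a, b in zip(g1, g2))
    if den == 0.0:
        lam = 0.5  # gradients coincide; any convex combination works
    else:
        num = sum((b - a) * b for a, b in zip(g1, g2))
        lam = max(0.0, min(1.0, num / den))
    return [-(lam * a + (1.0 - lam) * b) for a, b in zip(g1, g2)]

def multiobjective_steepest_descent(x, iters=60, step=0.25):
    # Toy problem: f1(x) = (x0 - 1)^2 + x1^2, f2(x) = (x0 + 1)^2 + x1^2.
    # Its Pareto set is the segment {(t, 0) : -1 <= t <= 1}.
    for _ in range(iters):
        g1 = [2.0 * (x[0] - 1.0), 2.0 * x[1]]
        g2 = [2.0 * (x[0] + 1.0), 2.0 * x[1]]
        d = common_descent_direction(g1, g2)
        x = [xi + step * di for xi, di in zip(x, d)]
    return x
```

Starting from (0, 2), the iterates move straight down to the Pareto point (0, 0), where the common descent direction vanishes.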
Published in:
Journal of Bioscience and Bioengineering. 127:710-713
Dimethyl trisulfide (DMTS) is the main component of hineka, an off-flavor generated in sake during storage. Genshu, or undiluted sake, is usually diluted with water during warimizu, the process of adjusting the alcohol content of sake. In this study, …
Published in:
Journal of Optimization Theory and Applications. 181:883-904
The augmented Lagrangian method is a classical solution method for nonlinear optimization problems. At each iteration, it minimizes an augmented Lagrangian function that consists of the constraint functions and the corresponding Lagrange multipliers.
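The classical scheme the entry describes, minimizing the augmented Lagrangian in an inner loop and then updating the multipliers, can be sketched on a toy equality-constrained problem. The problem, penalty parameter, and step sizes below are assumptions for illustration, not the paper's algorithm.

```python
def augmented_lagrangian_demo(rho=10.0, outer_iters=20, inner_iters=200, step=0.01):
    # Toy problem: minimize x^2 + y^2  subject to  c(x, y) = x + y - 1 = 0.
    # Known solution: x = y = 0.5 with Lagrange multiplier lam = -1.
    x, y, lam = 0.0, 0.0, 0.0
    for _ in range(outer_iters):
        # Inner loop: gradient descent on the augmented Lagrangian
        # L(x, y) = f(x, y) + lam * c(x, y) + (rho / 2) * c(x, y)^2
        for _ in range(inner_iters):
            c = x + y - 1.0
            gx = 2.0 * x + lam + rho * c
            gy = 2.0 * y + lam + rho * c
            x -= step * gx
            y -= step * gy
        # Outer step: multiplier update lam <- lam + rho * c(x, y)
        lam += rho * (x + y - 1.0)
    return x, y, lam
```

The multiplier estimate contracts toward -1 at each outer iteration, so the constraint is satisfied in the limit even though each inner minimization is only approximate.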
Author:
Nobuo Yamashita, Yan Gu
Published in:
Computational and Applied Mathematics. 40
The alternating direction method of multipliers (ADMM) is an effective method for solving a wide range of convex problems. At each iteration, the classical ADMM solves two subproblems exactly. However, in many applications, it is expensive or impossible…
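The classical setting the entry contrasts against, where both ADMM subproblems are solved exactly, can be shown on a toy consensus problem with closed-form subproblem solutions. All names and parameter values below are illustrative assumptions.

```python
def admm_consensus(a=3.0, b=-1.0, rho=1.0, iters=100):
    # Scaled-form ADMM for: minimize (x - a)^2 + (z - b)^2  subject to  x = z.
    # The optimal consensus value is (a + b) / 2. Both subproblems are
    # strongly convex quadratics, so each argmin is available in closed form,
    # matching the "solves two subproblems exactly" setting.
    x = z = u = 0.0
    for _ in range(iters):
        x = (2.0 * a + rho * (z - u)) / (2.0 + rho)  # x-update (exact argmin)
        z = (2.0 * b + rho * (x + u)) / (2.0 + rho)  # z-update (exact argmin)
        u = u + x - z                                # scaled dual update
    return x, z
```

When the subproblems lack closed forms, these two updates must themselves be solved iteratively, which is the inexact setting the entry alludes to.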
The limited memory BFGS (L-BFGS) method is one of the popular methods for solving large-scale unconstrained optimization. Since the standard L-BFGS method uses a line search to guarantee its global convergence, it sometimes requires a large number of …
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::3ecb558fad41f4ce5166e1af6a61c386
http://arxiv.org/abs/2101.04413
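The standard L-BFGS iteration the entry refers to, the two-loop recursion combined with a backtracking line search, can be sketched in a few dozen lines. This is a generic sketch of the classical method, not the variant proposed in the cited work; the memory size, Armijo constant, and test problem are assumptions.

```python
def lbfgs(f, grad, x0, m=5, iters=50, tol=1e-8):
    # Standard L-BFGS: two-loop recursion + backtracking (Armijo) line search.
    x = list(x0)
    g = grad(x)
    s_hist, y_hist = [], []
    for _ in range(iters):
        if max(abs(gi) for gi in g) < tol:
            break
        # Two-loop recursion: apply the implicit inverse-Hessian estimate to g.
        q = list(g)
        stack = []
        for s, y in zip(reversed(s_hist), reversed(y_hist)):
            rho = 1.0 / sum(yi * si for yi, si in zip(y, s))
            alpha = rho * sum(si * qi for si, qi in zip(s, q))
            q = [qi - alpha * yi for qi, yi in zip(q, y)]
            stack.append((alpha, rho, s, y))
        if s_hist:  # initial scaling gamma = s'y / y'y
            s, y = s_hist[-1], y_hist[-1]
            gamma = sum(si * yi for si, yi in zip(s, y)) / sum(yi * yi for yi in y)
            q = [gamma * qi for qi in q]
        for alpha, rho, s, y in reversed(stack):
            beta = rho * sum(yi * qi for yi, qi in zip(y, q))
            q = [qi + (alpha - beta) * si for qi, si in zip(q, s)]
        d = [-qi for qi in q]  # quasi-Newton descent direction
        # Backtracking line search enforcing the Armijo condition; each trial
        # costs one function evaluation, the expense the entry mentions.
        fx, t = f(x), 1.0
        slope = sum(gi * di for gi, di in zip(g, d))
        for _ in range(50):
            if f([xi + t * di for xi, di in zip(x, d)]) <= fx + 1e-4 * t * slope:
                break
            t *= 0.5
        x_new = [xi + t * di for xi, di in zip(x, d)]
        g_new = grad(x_new)
        s_hist.append([a - b for a, b in zip(x_new, x)])
        y_hist.append([a - b for a, b in zip(g_new, g)])
        if len(s_hist) > m:
            s_hist.pop(0)
            y_hist.pop(0)
        x, g = x_new, g_new
    return x
```

On a strictly convex quadratic the stored curvature pairs quickly capture the Hessian, so the unit step is accepted and convergence is fast.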
Many descent algorithms for multiobjective optimization have been developed in the last two decades. Tanabe et al. (Comput Optim Appl 72(2):339-361, 2019) proposed a proximal gradient method for multiobjective optimization, which can solve multiobjective…
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::c8ff0066a595787e137a4e6ff99e0542
http://arxiv.org/abs/2010.08217
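The multiobjective proximal gradient method builds on the scalar proximal gradient iteration: a gradient step on the smooth part followed by the proximal operator of the nonsmooth part. A minimal single-objective sketch (not the multiobjective algorithm of the cited work; the problem and step size are assumptions):

```python
def soft_threshold(v, t):
    # Proximal operator of t * |.| (soft-thresholding).
    if v > t:
        return v - t
    if v < -t:
        return v + t
    return 0.0

def proximal_gradient(a=3.0, lam=1.0, step=0.4, iters=100):
    # Scalar sketch of proximal gradient on
    #   minimize 0.5 * (x - a)^2 + lam * |x|,
    # whose closed-form solution is soft_threshold(a, lam).
    x = 0.0
    for _ in range(iters):
        g = x - a                                    # gradient of smooth part
        x = soft_threshold(x - step * g, step * lam)  # proximal step
    return x
```

With a = 3 and lam = 1 the iterates converge to the closed-form solution 2; the multiobjective version replaces the single gradient step with a common descent subproblem over all objectives.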