Showing 1 - 5 of 5 for search: '"Hong, Yusu"'
Selecting the best code solution from multiple generated ones is an essential task in code generation, which can be achieved by using some reliable validators (e.g., developer-written test cases) for assistance. Since reliable test cases are not always…
External link:
http://arxiv.org/abs/2409.08692
Author:
Hong, Yusu, Lin, Junhong
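The entry above describes selecting the best generated code solution using reliable validators such as developer-written test cases. A minimal sketch of that selection idea, with hypothetical names (`select_best`, `candidates`, `tests` are illustrative, not from the paper):

```python
def select_best(candidates, tests):
    """Pick the candidate solution passing the most developer-written tests.

    `candidates` are callables; `tests` are (input, expected) pairs.
    """
    def score(fn):
        passed = 0
        for arg, expected in tests:
            try:
                if fn(arg) == expected:
                    passed += 1
            except Exception:
                pass  # a crashing candidate simply fails that test
        return passed
    return max(candidates, key=score)

# Two generated "solutions" for squaring a number; only one is correct.
best = select_best([lambda x: x * x, lambda x: x + x],
                   [(2, 4), (3, 9)])
```

The abstract's point is that such reliable tests are not always available, which motivates the paper's approach.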
In this study, we revisit the convergence of AdaGrad with momentum (covering AdaGrad as a special case) on non-convex smooth optimization problems. We consider a general noise model where the noise magnitude is controlled by the function value gap to…
External link:
http://arxiv.org/abs/2402.13794
Author:
Hong, Yusu, Lin, Junhong
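The entry above concerns AdaGrad with momentum. As a point of reference, a minimal sketch of that update (heavy-ball-style momentum on top of AdaGrad's per-coordinate accumulator; the hyperparameters and test function are illustrative, not taken from the paper):

```python
import numpy as np

def adagrad_momentum(grad, x0, lr=0.5, beta=0.9, eps=1e-8, steps=400):
    """AdaGrad with momentum sketch: an exponential moving average of
    gradients is scaled by AdaGrad's accumulated squared gradients."""
    x = np.asarray(x0, dtype=float)
    m = np.zeros_like(x)  # momentum buffer (EMA of gradients)
    v = np.zeros_like(x)  # AdaGrad accumulator (sum of squared gradients)
    for _ in range(steps):
        g = grad(x)
        v += g * g
        m = beta * m + (1 - beta) * g
        x -= lr * m / (np.sqrt(v) + eps)
    return x

# Usage: minimize the smooth function f(x) = ||x||^2, whose gradient is 2x.
x_star = adagrad_momentum(lambda x: 2 * x, [3.0, -2.0])
```

Setting `beta=0` recovers plain AdaGrad, matching the abstract's remark that AdaGrad is covered as a special case.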
The Adaptive Momentum Estimation (Adam) algorithm is highly effective in training various deep learning tasks. Despite this, there's limited theoretical understanding for Adam, especially when focusing on its vanilla form in non-convex smooth scenarios…
External link:
http://arxiv.org/abs/2402.03982
Author:
Hong, Yusu, Lin, Junhong
In this paper, we study the convergence of the Adaptive Moment Estimation (Adam) algorithm under unconstrained non-convex smooth stochastic optimization. Despite the widespread usage in machine learning areas, its theoretical properties remain limited…
External link:
http://arxiv.org/abs/2311.02000
Author:
Hong, Yusu, Lin, Junhong
Published in:
In Journal of Complexity, February 2025, Vol. 86