Showing 1 - 10 of 12,081
for search: '"Grimmer"'
Recent works by Altschuler and Parrilo and the authors have shown that it is possible to accelerate the convergence of gradient descent on smooth convex functions, even without momentum, just by picking special stepsizes. In this paper, we provide a…
External link:
http://arxiv.org/abs/2410.16249
Author:
Luner, Alan, Grimmer, Benjamin
We extend recent computer-assisted design and analysis techniques for first-order optimization over structured functions--known as performance estimation--to apply to structured sets. We prove "interpolation theorems" for smooth and strongly convex s…
External link:
http://arxiv.org/abs/2410.14811
We introduce AutoPersuade, a three-part framework for constructing persuasive messages. First, we curate a large dataset of arguments with human evaluations. Next, we develop a novel topic model to identify argument features that influence persuasive…
External link:
http://arxiv.org/abs/2410.08917
Author:
Grimmer, Marcel, Busch, Christoph
Face morphing attacks pose a severe security threat to face recognition systems, enabling the morphed face image to be verified against multiple identities. To detect such manipulated images, the development of new face morphing methods becomes essen…
External link:
http://arxiv.org/abs/2410.07988
Drori and Teboulle [4] conjectured that the minimax optimal constant stepsize for N steps of gradient descent is given by the stepsize that balances performance on Huber and quadratic objective functions. This was numerically supported by semidefinit…
External link:
http://arxiv.org/abs/2407.11739
Author:
Ortner M, Stange M, Schneider H, Schröder C, Buerger K, Müller C, Müller-Sarnowski F, Diehl-Schmid J, Förstl H, Grimmer T, Steimer W
Published in:
Drug Design, Development and Therapy, Volume 14, pp. 3251-3262 (2020)
Marion Ortner,1 Marion Stange,1 Heike Schneider,1 Charlotte Schröder,2 Katharina Buerger,3 Claudia Müller,3 Felix Müller-Sarnowski,1 Janine Diehl-Schmid,1 Hans Förstl,1 Timo Grimmer,1,* Werner Steimer2,* 1Department of Psychiatry and Psychothera…
External link:
https://doaj.org/article/b19835ae5be245aa8c0083f2555406ee
Author:
Ortner M, Hauser C, Schmaderer C, Muggenthaler C, Hapfelmeier A, Sorg C, Diehl-Schmid J, Kurz A, Förstl H, Ikenberg B, Kotliar K, Poppert H, Grimmer T
Published in:
Neuropsychiatric Disease and Treatment, Volume 15, pp. 3487-3499 (2019)
Marion Ortner,1 Christine Hauser,2 Christoph Schmaderer,2 Claudia Muggenthaler,1 Alexander Hapfelmeier,3 Christian Sorg,1,4 Janine Diehl-Schmid,1 Alexander Kurz,1 Hans Förstl,1 Benno Ikenberg,5 Konstantin Kotliar,6 Holger Poppert,5,7,* Timo Grimmer1…
External link:
https://doaj.org/article/49f043aa2d6c42e6882554ba4c0608fd
This work considers gradient descent for L-smooth convex optimization with stepsizes larger than the classic regime where descent can be ensured. The stepsize schedules considered are similar to but differ slightly from the recent silver stepsizes of…
External link:
http://arxiv.org/abs/2403.14045
Author:
Samakhoana, Thabo, Grimmer, Benjamin
Recent works have developed new projection-free first-order methods based on utilizing linesearches and normal vector computations to maintain feasibility. These oracles can be cheaper than orthogonal projection or linear optimization subroutines but…
External link:
http://arxiv.org/abs/2403.13688
Author:
Luner, Alan, Grimmer, Benjamin
This work considers the effect of averaging, and more generally extrapolation, of the iterates of gradient descent in smooth convex optimization. After running the method, rather than reporting the final iterate, one can report either a convex combin…
External link:
http://arxiv.org/abs/2402.12493