Showing 1 - 10 of 12
for the search: '"Kfir Yehuda Levy"'
Published in:
International Journal of Electrical Power & Energy Systems. 148:108949
Published in:
Scopus-Elsevier
In this work we investigate stochastic non-convex optimization problems where the objective is an expectation over smooth loss functions, and the goal is to find an approximate stationary point. The most popular approach to handling such problems is …
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::b55a0a908896c2d4c71b0adc6b709b97
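The abstract above is cut off mid-sentence; the popular approach it alludes to is, in all likelihood, stochastic gradient descent (SGD). As a rough, hypothetical illustration of finding an approximate stationary point of a smooth non-convex objective (the toy objective, noise model, and step schedule below are illustrative, not the paper's):

```python
import random

def grad(x):
    # Gradient of the smooth non-convex objective f(x) = x**4 / 4 - x**2 / 2,
    # which has stationary points at x = -1, 0, and 1.
    return x**3 - x

def stochastic_grad(x, rng, noise=0.1):
    # Unbiased noisy gradient oracle: E[g(x)] = grad(x).
    return grad(x) + rng.gauss(0.0, noise)

def sgd(x0, steps=5000, lr=0.05, seed=0):
    rng = random.Random(seed)
    x = x0
    for t in range(1, steps + 1):
        x -= (lr / t**0.5) * stochastic_grad(x, rng)  # decaying step size
    return x

x = sgd(2.0)  # |grad(x)| is small: x is an approximate stationary point
```

The decaying step size damps the gradient noise, so the iterate settles near a stationary point even though each individual gradient estimate is noisy.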
Published in:
Scopus-Elsevier
Adaptive importance sampling for stochastic optimization is a promising approach that offers improved convergence through variance reduction. In this work, we propose a new framework for variance reduction that enables the use of mixtures over predefined …
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::679726d2e80da9b00fe878eb4e6d7ccf
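The abstract breaks off, but the core mechanism behind importance sampling for variance reduction is easy to demonstrate: sample components of a finite sum non-uniformly and reweight so the estimator stays unbiased. A toy sketch (the numbers and the proportional-to-magnitude sampling rule are illustrative, not the paper's mixture framework):

```python
import random

# Finite sum: f(x) = (1/n) * sum_i c_i * x, so component i has gradient c_i.
c = [0.1] * 9 + [10.0]          # one component dominates the gradient
n = len(c)
full_grad = sum(c) / n

def estimate(probs, rng):
    # Sample one index i ~ probs and reweight: E[c_i / (n * p_i)] = full_grad,
    # so the estimator is unbiased for any sampling distribution.
    i = rng.choices(range(n), weights=probs)[0]
    return c[i] / (n * probs[i])

def mean_and_var(probs, trials=20000, seed=0):
    rng = random.Random(seed)
    samples = [estimate(probs, rng) for _ in range(trials)]
    mean = sum(samples) / trials
    return mean, sum((s - mean) ** 2 for s in samples) / trials

uniform = [1.0 / n] * n
adapted = [ci / sum(c) for ci in c]   # proportional to gradient magnitude

m_u, v_u = mean_and_var(uniform)
m_a, v_a = mean_and_var(adapted)
# Both means are close to full_grad, but v_a is (here exactly) zero while
# v_u is large: reweighted proportional sampling removes the variance.
```

With probabilities proportional to the component magnitudes, each reweighted sample equals the full gradient exactly, so the variance collapses; uniform sampling keeps the same mean but pays a much higher variance.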
http://arxiv.org/abs/1903.12416
Author:
Kfir Yehuda Levy, A. Krause
Published in:
Proceedings of Machine Learning Research, 89
Proceedings of the 22nd International Conference on Artificial Intelligence and Statistics (AISTATS 2019)
Scopus-Elsevier
ISSN:2640-3498
External link:
https://explore.openaire.eu/search/publication?articleId=dedup_wf_001::49f154058afacb43502fcde1feec575c
https://hdl.handle.net/20.500.11850/394317
Published in:
Scopus-Elsevier
We present a novel method for convex unconstrained optimization that, without any modifications, ensures: (i) accelerated convergence rate for smooth objectives, (ii) standard convergence rate in the general (non-smooth) setting, and (iii) standard convergence rate …
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::f3cedc0ea0359cbd127d1ee145245760
Published in:
Scopus-Elsevier
We consider the problem of training generative models with a Generative Adversarial Network (GAN). Although GANs can accurately model complex distributions, they are known to be difficult to train due to instabilities caused by a difficult minimax optimization …
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::18c41fa74f2a2c1830f06a01334baf1e
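The training instability mentioned here already shows up in the simplest minimax problem, min over x, max over y of x*y: simultaneous gradient descent-ascent spirals away from the equilibrium at the origin. A self-contained toy illustration (a textbook example, not the paper's GAN setup):

```python
def gda(x, y, lr=0.1, steps=100):
    # Simultaneous gradient descent-ascent on f(x, y) = x * y:
    # x descends along df/dx = y while y ascends along df/dy = x.
    for _ in range(steps):
        x, y = x - lr * y, y + lr * x
    return x, y

x0, y0 = 1.0, 1.0
x, y = gda(x0, y0)
r0 = (x0 ** 2 + y0 ** 2) ** 0.5
r = (x ** 2 + y ** 2) ** 0.5
# Each step multiplies the squared distance to the unique equilibrium (0, 0)
# by exactly 1 + lr**2, so the iterates spiral outward instead of converging.
```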
http://www.scopus.com/inward/record.url?eid=2-s2.0-85083950788&partnerID=MN8TOARS
Author:
Kfir Yehuda Levy
Published in:
Scopus-Elsevier
We present an approach towards convex optimization that relies on a novel scheme which converts online adaptive algorithms into offline methods. In the offline optimization setting, our derived methods are shown to obtain favourable adaptive guarantees …
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::e28c2bd3b206b9a21c47363d222f2387
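The abstract is truncated, but the generic online-to-offline (online-to-batch) conversion it builds on can be sketched: run an online algorithm over the losses and return the averaged iterate. A minimal version using plain online gradient descent on a fixed convex loss (the paper converts *adaptive* online methods; this is only the textbook baseline):

```python
def online_to_batch(grad, x0, lrs):
    # Online gradient descent over the loss sequence; the offline answer is
    # the average of the online iterates (the classic conversion for convex
    # objectives, whose regret bound becomes a convergence-rate bound).
    x = x0
    iterates = []
    for lr in lrs:
        iterates.append(x)
        x -= lr * grad(x)
    return sum(iterates) / len(iterates)

# Convex toy loss f(x) = (x - 3)**2 with gradient 2 * (x - 3); minimizer 3.
grad = lambda x: 2.0 * (x - 3.0)
lrs = [0.1 / (t + 1) ** 0.5 for t in range(200)]
x_bar = online_to_batch(grad, 0.0, lrs)  # approaches the minimizer 3
```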
http://www.scopus.com/inward/record.url?eid=2-s2.0-85046999244&partnerID=MN8TOARS
Published in:
Scopus-Elsevier
We propose a novel technique for faster deep neural network training which systematically applies sample-based approximation to the constituent tensor operations, i.e., matrix multiplications and convolutions. We introduce new sampling techniques, st…
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::f07b2683a61ab90d384a84288f0c9676
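Sample-based approximation of a matrix multiplication usually means column-row sampling: A @ B is a sum of k outer products, and a random reweighted subset of them gives an unbiased estimate. A small sketch of that standard construction (the paper's specific sampling techniques are not spelled out in the snippet):

```python
import random

def matmul(A, B):
    # Exact product of an m x k and a k x n matrix (lists of lists).
    m, k, n = len(A), len(B), len(B[0])
    return [[sum(A[i][l] * B[l][j] for l in range(k)) for j in range(n)]
            for i in range(m)]

def sampled_matmul(A, B, s, seed=0):
    # A @ B = sum over l of outer(A[:, l], B[l, :]). Sample s of the k outer
    # products with probability p_l each, reweighting by 1 / (s * p_l) so
    # the estimate stays unbiased.
    rng = random.Random(seed)
    m, k, n = len(A), len(B), len(B[0])
    # Standard importance weights: p_l proportional to ||A[:, l]|| * ||B[l, :]||.
    norms = [(sum(A[i][l] ** 2 for i in range(m)) ** 0.5)
             * (sum(B[l][j] ** 2 for j in range(n)) ** 0.5) for l in range(k)]
    total = sum(norms)
    probs = [w / total for w in norms]
    C = [[0.0] * n for _ in range(m)]
    for l in rng.choices(range(k), weights=probs, k=s):
        w = 1.0 / (s * probs[l])
        for i in range(m):
            for j in range(n):
                C[i][j] += w * A[i][l] * B[l][j]
    return C

A = [[1.0, 2.0, 0.0], [0.0, 1.0, 1.0]]
B = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
exact = matmul(A, B)                 # [[1, 2], [1, 2]]
approx = sampled_matmul(A, B, 5000)  # close to exact, from sampled terms only
```

In a real training setting s would be much smaller than k, trading a little approximation error for fewer multiply-accumulate operations.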
http://www.scopus.com/inward/record.url?eid=2-s2.0-85131866071&partnerID=MN8TOARS
Published in:
Scopus-Elsevier
In high-stakes machine learning applications, it is crucial to not only perform well on average, but also when restricted to difficult examples. To address this, we consider the problem of training models in a risk-averse manner. We propose an adaptive …
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::75f276f5fd875c5659c6fd4e8323a34b
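Risk-averse training typically optimizes a tail measure of the loss rather than the mean, a common choice being the Conditional Value at Risk (CVaR): the average of the worst α-fraction of losses. The snippet does not say which risk measure the paper adopts, so the following is only a generic illustration:

```python
def cvar(losses, alpha=0.1):
    # Conditional Value at Risk: the empirical mean of the worst
    # alpha-fraction of the losses.
    k = max(1, int(round(alpha * len(losses))))
    worst = sorted(losses, reverse=True)[:k]
    return sum(worst) / k

losses = [0.1, 0.2, 0.1, 0.3, 5.0, 0.2, 0.1, 0.4, 0.2, 0.1]
mean_loss = sum(losses) / len(losses)  # about 0.67: looks fine on average
tail_loss = cvar(losses, alpha=0.1)    # 5.0: the hardest example dominates
```

Minimizing `cvar` instead of `mean_loss` forces the model to pay attention to the difficult examples the abstract refers to.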
http://www.scopus.com/inward/record.url?eid=2-s2.0-85101863929&partnerID=MN8TOARS
Published in:
Scopus-Elsevier
The graduated optimization approach, also known as the continuation method, is a popular heuristic to solving non-convex problems that has received renewed interest over the last decade. Despite its popularity, very little is known in terms of theoretical …
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::05aadab2a97e9a157ee8fcdbef24b86d
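Graduated optimization (the continuation method) minimizes a sequence of progressively less-smoothed versions of the objective, warm-starting each stage at the previous solution. A deterministic 1-D sketch with closed-form Gaussian smoothing (the toy objective and schedule are my own; analyses like the paper's typically assume a noisy smoothing oracle):

```python
import math

def grad_smoothed(x, sigma):
    # Objective f(x) = x**2 - 4*cos(3*x). Its Gaussian smoothing
    # E[f(x + sigma*Z)] = x**2 + sigma**2 - 4*cos(3*x)*exp(-4.5*sigma**2)
    # has this closed-form gradient:
    return 2 * x + 12 * math.sin(3 * x) * math.exp(-4.5 * sigma ** 2)

def gd(x, sigma, lr=0.02, steps=300):
    # Plain gradient descent on the sigma-smoothed objective.
    for _ in range(steps):
        x -= lr * grad_smoothed(x, sigma)
    return x

def graduated(x0, sigmas=(1.0, 0.5, 0.25, 0.1, 0.0)):
    x = x0
    for sigma in sigmas:          # coarse-to-fine schedule, warm-started
        x = gd(x, sigma)
    return x

x_grad = graduated(2.0)   # reaches the global minimum near x = 0
x_plain = gd(2.0, 0.0)    # plain gradient descent stalls in a local minimum
```

Heavy smoothing washes out the cosine ripples, so the first stage lands in the global basin; later stages then refine the solution on the true objective.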
http://www.scopus.com/inward/record.url?eid=2-s2.0-84998952820&partnerID=MN8TOARS