Showing 1 - 4 of 4 for search: '"Arthur Jacot"'
Author:
Arthur Jacot, Matthieu Wyart, Giulio Biroli, Stéphane d'Ascoli, Clément Hongler, Franck Gabriel, Mario Geiger, Stefano Spigler, Levent Sagun
Supervised deep learning involves the training of neural networks with a large number $N$ of parameters. For large enough $N$, in the so-called over-parametrized regime, one can essentially fit the training data points. Sparsity-based arguments would …
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::e145c059482d2ea1a506411ff1a2a052
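The abstract's claim that over-parametrized models "essentially fit the training data points" can be illustrated numerically. The sketch below is an assumption-laden toy (a random ReLU feature model, not the paper's setup): with many more random features than data points, the minimum-norm least-squares read-out interpolates arbitrary targets.

```python
import numpy as np

# Toy illustration of the over-parametrized regime: with p >> n random
# ReLU features, the min-norm least-squares read-out fits n points exactly.
rng = np.random.default_rng(0)
n, p = 10, 200                       # n data points, p >> n parameters
X = rng.normal(size=(n, 1))          # 1-D inputs
y = rng.normal(size=n)               # arbitrary targets
W = rng.normal(size=(1, p))          # random first-layer weights (frozen)
b = rng.normal(size=p)               # random biases
Phi = np.maximum(X @ W + b, 0.0)     # random ReLU feature map, shape (n, p)
beta = np.linalg.pinv(Phi) @ y       # min-norm interpolating read-out
train_err = np.max(np.abs(Phi @ beta - y))
print(train_err)                     # effectively zero: the data are fit exactly
```

With `p = 200` features and `n = 10` points the feature matrix generically has full row rank, so the training error is zero up to floating-point noise.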
Author:
Mario Geiger, Arthur Jacot, Stefano Spigler, Franck Gabriel, Levent Sagun, Stéphane d’Ascoli, Giulio Biroli, Clément Hongler, Matthieu Wyart
Published in:
Journal of Statistical Mechanics: Theory & Experiment; Feb 2020, Vol. 2020 Issue 2, p1-1, 1p
Two distinct limits for deep learning have been derived as the network width $h\rightarrow \infty$, depending on how the weights of the last layer scale with $h$. In the Neural Tangent Kernel (NTK) limit, the dynamics becomes linear in the weights and …
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::faf5ef819a22c41714fe44e5d4079fd6
https://infoscience.epfl.ch/record/282180
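The abstract contrasts two infinite-width limits distinguished by the last-layer scaling. A minimal numeric sketch (a toy one-hidden-layer ReLU network of my own construction, not the paper's code) shows the difference at initialization: a $1/\sqrt{h}$ read-out keeps the output $O(1)$, while a $1/h$ read-out (mean-field scaling) drives it toward zero as the width grows.

```python
import numpy as np

# Same random hidden layer, two read-out normalizations:
# 1/sqrt(h) (NTK-style scaling) keeps the output O(1) at initialization,
# 1/h (mean-field scaling) shrinks it like 1/sqrt(h) as width h grows.
rng = np.random.default_rng(1)
h = 100_000                     # hidden width
x = 1.0                         # a single scalar input
w = rng.normal(size=h)          # hidden-layer weights
a = rng.normal(size=h)          # read-out weights
phi = np.maximum(w * x, 0.0)    # ReLU activations

f_ntk = a @ phi / np.sqrt(h)    # O(1) random value at initialization
f_mf = a @ phi / h              # smaller by a factor sqrt(h)
print(f_ntk, f_mf)
```

By construction `f_mf = f_ntk / sqrt(h)`, so at this width the mean-field output is already a few hundred times smaller than the NTK-scaled one.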
Published in:
STOC
The Neural Tangent Kernel is a new way to understand gradient descent in deep neural networks, connecting them with kernel methods. In this talk, I'll introduce this formalism, give a number of results on the Neural Tangent Kernel, and explain …
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::faac3cf3882bacc97d50fd35c33cad53
https://infoscience.epfl.ch/record/295224
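The kernel the talk abstract refers to can be computed empirically for a tiny network. The sketch below is an illustrative assumption (a one-hidden-layer ReLU net with hand-derived parameter gradients; all helper names are hypothetical): the empirical NTK entry $K(x, x')$ is the inner product of the parameter gradients of the network output at $x$ and $x'$.

```python
import numpy as np

def param_grad(x, w, a):
    """Gradient of f(x) = a . relu(w*x) / sqrt(h) w.r.t. all parameters (w, a)."""
    h = len(w)
    pre = w * x
    act = np.maximum(pre, 0.0)          # relu(w*x)
    dact = (pre > 0.0).astype(float)    # relu'(w*x)
    g_w = a * dact * x / np.sqrt(h)     # df/dw_j
    g_a = act / np.sqrt(h)              # df/da_j
    return np.concatenate([g_w, g_a])

def empirical_ntk(xs, w, a):
    """Gram matrix K(x, x') = <grad_theta f(x), grad_theta f(x')>."""
    G = np.stack([param_grad(x, w, a) for x in xs])
    return G @ G.T

rng = np.random.default_rng(0)
h = 5_000
w, a = rng.normal(size=h), rng.normal(size=h)
xs = [-1.0, 0.5, 2.0]
K = empirical_ntk(xs, w, a)
# K is symmetric and positive semi-definite, like any kernel Gram matrix
```

Being a Gram matrix of gradient vectors, `K` is automatically symmetric and positive semi-definite, which is what lets the training dynamics be analyzed with kernel-method tools.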