Showing 1 - 10
of 16,539
for search: '"A. Hendrikx"'
Author:
Bakker, P.S.
Published in:
Maandblad voor Vermogensrecht. 2024, Vol. 34 Issue 1, p1-5. 5p.
Published in:
Transactions on Machine Learning Research 2024
Weight averaging of Stochastic Gradient Descent (SGD) iterates is a popular method for training deep learning models. While it is often used as part of complex training pipelines to improve generalization or serve as a `teacher' model, weight averaging …
External link:
http://arxiv.org/abs/2411.18704
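As a minimal sketch of the idea in this abstract, here is tail-averaging of SGD iterates on a toy one-dimensional quadratic. The objective, step size, and burn-in length are illustrative assumptions, not the paper's setup; the point is only that averaging the late iterates damps the gradient noise that a single iterate retains.

```python
import random

def sgd_with_tail_averaging(steps=2000, lr=0.05, burn_in=1000, seed=0):
    """Run noisy SGD on f(w) = (w - 3)^2 and average the tail iterates."""
    rng = random.Random(seed)
    w = 0.0
    tail_sum, tail_count = 0.0, 0
    for t in range(steps):
        grad = 2.0 * (w - 3.0) + rng.gauss(0.0, 1.0)  # stochastic gradient
        w -= lr * grad
        if t >= burn_in:              # average only the iterates after burn-in
            tail_sum += w
            tail_count += 1
    return w, tail_sum / tail_count

last, averaged = sgd_with_tail_averaging()
```

The averaged iterate sits much closer to the minimizer (w = 3) than the last raw iterate, whose distance to the optimum is dominated by gradient noise.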
Distributed approaches have many computational benefits, but they are vulnerable to attacks from a subset of devices transmitting incorrect information. This paper investigates Byzantine-resilient algorithms in a decentralized setting, where devices …
External link:
http://arxiv.org/abs/2410.10418
Distributed approaches have many computational benefits, but they are vulnerable to attacks from a subset of devices transmitting incorrect information. This paper investigates Byzantine-resilient algorithms in a decentralized setting, where devices …
External link:
http://arxiv.org/abs/2405.03449
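For intuition on Byzantine resilience, here is the coordinate-wise trimmed mean, one standard robust aggregation rule; the decentralized algorithms in these papers may differ, so this is only a sketch of the general defense: bounded numbers of corrupted inputs cannot drag the aggregate arbitrarily far.

```python
def trimmed_mean(vectors, f):
    """Coordinate-wise trimmed mean: for each coordinate, drop the f largest
    and f smallest values before averaging, so up to f Byzantine vectors
    cannot move the aggregate outside the range of the honest values."""
    n = len(vectors)
    assert n > 2 * f, "need more inputs than the 2f trimmed ones"
    dim = len(vectors[0])
    out = []
    for j in range(dim):
        col = sorted(v[j] for v in vectors)
        kept = col[f:n - f]
        out.append(sum(kept) / len(kept))
    return out

# One attacker sends 1000.0; trimming f=1 per side discards it.
robust = trimmed_mean([[0.0], [1.0], [2.0], [1000.0]], f=1)
```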
Author:
Hendrikx, Hadrien
Mirror Descent is a popular algorithm that extends Gradient Descent (GD) beyond Euclidean geometry. One of its benefits is to enable strong convergence guarantees through smooth-like analyses, even for objectives with exploding or vanishing curvature …
External link:
http://arxiv.org/abs/2404.12213
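A concrete, hedged instance of Mirror Descent: with the negative-entropy mirror map on the probability simplex, the update becomes multiplicative (exponentiated gradient) followed by renormalization, instead of a Euclidean gradient step. This is one standard instantiation for illustration, not necessarily the setting of the paper.

```python
import math

def mirror_descent_simplex(grad_fn, x0, lr=0.1, steps=100):
    """Mirror Descent with the negative-entropy mirror map on the simplex:
    multiply each coordinate by exp(-lr * gradient), then renormalize so
    the iterate stays a probability vector."""
    x = list(x0)
    for _ in range(steps):
        g = grad_fn(x)
        x = [xi * math.exp(-lr * gi) for xi, gi in zip(x, g)]
        s = sum(x)
        x = [xi / s for xi in x]
    return x

# Minimize the linear objective <c, x> over the simplex; the mass should
# concentrate on the coordinate with the smallest cost.
c = [1.0, 0.5, 2.0]
x = mirror_descent_simplex(lambda x: c, [1 / 3, 1 / 3, 1 / 3], lr=0.5, steps=200)
```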
The Gaussian Mechanism (GM), which consists of adding Gaussian noise to a vector-valued query before releasing it, is a standard privacy protection mechanism. In particular, given that the query respects some L2 sensitivity property (the L2 distance …
External link:
http://arxiv.org/abs/2308.15250
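A minimal sketch of the Gaussian Mechanism the abstract describes, using the classical noise calibration sigma = S * sqrt(2 ln(1.25/delta)) / epsilon (valid for epsilon < 1); the function name and parameters are illustrative, and the paper itself studies refinements of this mechanism.

```python
import math
import random

def gaussian_mechanism(query_result, l2_sensitivity, epsilon, delta, seed=None):
    """Release a vector-valued query with (epsilon, delta)-differential
    privacy by adding i.i.d. Gaussian noise whose scale is calibrated to
    the query's L2 sensitivity."""
    sigma = l2_sensitivity * math.sqrt(2.0 * math.log(1.25 / delta)) / epsilon
    rng = random.Random(seed)
    return [v + rng.gauss(0.0, sigma) for v in query_result]

# Privatize a 3-dimensional query with unit L2 sensitivity.
released = gaussian_mechanism([1.0, 2.0, 3.0], l2_sensitivity=1.0,
                              epsilon=0.5, delta=1e-5, seed=0)
```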
Gradient clipping is a popular modification to standard (stochastic) gradient descent that, at every iteration, limits the gradient norm to a certain value $c > 0$. It is widely used, for example, for stabilizing the training of deep learning models (Goodfellow …
External link:
http://arxiv.org/abs/2305.01588
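The clipping operation itself is a one-liner; here is a self-contained sketch. Gradients with norm at most c pass through unchanged, while larger gradients are rescaled to norm exactly c, preserving their direction.

```python
import math

def clip_gradient(grad, c):
    """If the Euclidean norm of grad exceeds c > 0, rescale it so the norm
    equals exactly c; only the length changes, never the direction."""
    norm = math.sqrt(sum(g * g for g in grad))
    if norm <= c:
        return list(grad)
    return [g * (c / norm) for g in grad]

clipped = clip_gradient([3.0, 4.0], c=1.0)  # norm 5.0 -> rescaled to norm 1.0
```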
In data-parallel optimization of machine learning models, workers collaborate to improve their estimates of the model: more accurate gradients allow them to use larger learning rates and optimize faster. In the decentralized setting, in which workers …
External link:
http://arxiv.org/abs/2301.02151
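To make the decentralized setting concrete, here is a toy gossip-averaging sketch: workers hold local values and repeatedly average with their graph neighbors, reaching consensus on the global mean without a central server. This is a generic illustration under an assumed ring topology, not the algorithm of the paper.

```python
def gossip_average(values, neighbors, steps=50):
    """Decentralized averaging: each worker repeatedly replaces its value
    with the uniform mean of its own value and its neighbors' values.
    On a regular graph this mixing is doubly stochastic, so all workers
    converge to the global average."""
    x = list(values)
    for _ in range(steps):
        x = [(x[i] + sum(x[j] for j in neighbors[i])) / (1 + len(neighbors[i]))
             for i in range(len(x))]
    return x

# Four workers on a ring; their values average to 4.0.
ring = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
consensus = gossip_average([0.0, 4.0, 8.0, 4.0], ring)
```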
Published in:
Arctic, Antarctic, and Alpine Research, Vol 56, Iss 1 (2024)
ABSTRACT: Snow avalanches are a hazard and ecological disturbance across mountain landscapes worldwide. Understanding how avalanche frequency affects forests and vegetation improves infrastructure planning, risk management, and avalanche forecasting. …
External link:
https://doaj.org/article/64113ab265f44d60b2b5bd84215395c1
Published in:
Frontiers in Immunology, Vol 15 (2024)
External link:
https://doaj.org/article/229e2945aba74843a0ef50f1ab2ec0f9