Showing 1 - 10 of 30 for search: '"Roith, Tim"'
A popular method for performing adversarial attacks on neural networks is the so-called fast gradient sign method (FGSM) and its iterative variant. In this paper, we interpret this method as an explicit Euler discretization of a differential inclusion, where…
External link:
http://arxiv.org/abs/2406.05376
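The abstract is cut off here; for context, the iterative fast gradient sign method it refers to takes repeated signed-gradient steps, each of which can be read as an explicit Euler step. A minimal PyTorch sketch, where the model, loss, and step sizes are illustrative assumptions rather than the paper's code:

```python
import torch

def iterative_fgsm(model, loss_fn, x, y, alpha=0.01, eps=0.03, steps=10):
    """Iterative FGSM: repeated explicit-Euler-like steps along sign(grad)."""
    x_adv = x.clone().detach()
    for _ in range(steps):
        x_adv.requires_grad_(True)
        loss = loss_fn(model(x_adv), y)
        grad, = torch.autograd.grad(loss, x_adv)
        # one explicit Euler step of size alpha along the sign of the gradient
        x_adv = x_adv.detach() + alpha * grad.sign()
        # project back into the eps-ball around the clean input
        x_adv = x + torch.clamp(x_adv - x, -eps, eps)
    return x_adv
```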
Authors:
Bailo, Rafael, Barbaro, Alethea, Gomes, Susana N., Riedl, Konstantin, Roith, Tim, Totzeck, Claudia, Vaes, Urbain
Published in:
Journal of Open Source Software (2024) 9(98)
We introduce CBXPy and ConsensusBasedX.jl, Python and Julia implementations of consensus-based interacting particle systems (CBX), which generalise consensus-based optimization methods (CBO) for global, derivative-free optimisation. The raison d'être…
External link:
http://arxiv.org/abs/2403.14470
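CBXPy's actual API is not shown in this snippet; a minimal NumPy sketch of the plain CBO dynamics that the CBX family generalises — a Gibbs-weighted consensus point, drift toward it, and consensus-scaled noise — with all parameter names being assumptions:

```python
import numpy as np

def cbo_step(x, f, dt=0.01, lam=1.0, sigma=1.0, alpha=100.0, rng=None):
    """One Euler-Maruyama step of consensus-based optimization (CBO).

    x: (N, d) array of particle positions; f: objective mapping (N, d) -> (N,).
    """
    rng = rng or np.random.default_rng()
    fx = f(x)                                    # evaluate objective on all particles
    w = np.exp(-alpha * (fx - fx.min()))         # Gibbs weights (shifted for stability)
    m = (w[:, None] * x).sum(0) / w.sum()        # weighted consensus point
    drift = -lam * (x - m) * dt                  # contraction toward consensus
    noise = sigma * np.linalg.norm(x - m, axis=1, keepdims=True) \
            * np.sqrt(dt) * rng.standard_normal(x.shape)
    return x + drift + noise

# usage: minimise a shifted quadratic with 50 particles in 2D
x = np.random.default_rng(0).standard_normal((50, 2)) * 3
for _ in range(500):
    x = cbo_step(x, lambda p: ((p - 1.0) ** 2).sum(axis=1))
```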
This paper presents a method for finding a sparse representation of Barron functions. Specifically, given an $L^2$ function $f$, the inverse scale space flow is used to find a sparse measure $\mu$ minimising the $L^2$ loss between the Barron function…
External link:
http://arxiv.org/abs/2312.02671
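The abstract is truncated; for reference, the inverse scale space flow it invokes is usually written as follows, with $E$ the data-fidelity term and $J$ the sparsity-promoting functional. Matching it to the Barron setting as below is our reading of the abstract, not a quote:

```latex
% Inverse scale space flow: the dual variable p descends the data term
% while staying a subgradient of the regularizer J.
\partial_t p(t) = -\nabla E\bigl(\mu(t)\bigr), \qquad p(t) \in \partial J\bigl(\mu(t)\bigr),
\quad \text{with, e.g., } E(\mu) = \tfrac{1}{2}\,\lVert f - f_\mu \rVert_{L^2}^2 ,
```

where $f_\mu$ denotes the Barron function parametrised by the measure $\mu$.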
In this paper we investigate the use of Fourier Neural Operators (FNOs) for image classification in comparison to standard Convolutional Neural Networks (CNNs). Neural operators are a discretization-invariant generalization of neural networks to approximate…
External link:
http://arxiv.org/abs/2304.01227
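A minimal sketch of the core FNO building block, a 2D spectral convolution — FFT, learned channel mixing on the lowest Fourier modes, inverse FFT. This is a simplified illustration (full FNO layers also retain negative-frequency modes and add a pointwise linear path); all names are assumptions:

```python
import torch

class SpectralConv2d(torch.nn.Module):
    """FFT -> learned mixing of the lowest Fourier modes -> inverse FFT.

    Keeping only low modes is what makes the layer discretization-invariant.
    """
    def __init__(self, channels, modes):
        super().__init__()
        self.modes = modes
        scale = 1.0 / channels
        self.weight = torch.nn.Parameter(
            scale * torch.randn(channels, channels, modes, modes, dtype=torch.cfloat))

    def forward(self, x):                          # x: (batch, channels, H, W)
        x_ft = torch.fft.rfft2(x)                  # complex spectrum
        out_ft = torch.zeros_like(x_ft)
        m = self.modes
        # mix channels on the retained low-frequency modes only
        out_ft[:, :, :m, :m] = torch.einsum(
            "bixy,ioxy->boxy", x_ft[:, :, :m, :m], self.weight)
        return torch.fft.irfft2(out_ft, s=x.shape[-2:])
```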
In this paper we propose polarized consensus-based dynamics in order to make consensus-based optimization (CBO) and sampling (CBS) applicable for objective functions with several global minima or distributions with many modes, respectively. For this, …
External link:
http://arxiv.org/abs/2211.05238
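One way to read "polarized" here: instead of a single global consensus point, each particle gets its own localized consensus point, with Gibbs weights damped by a kernel of pairwise distances. A NumPy sketch under that assumption, with all parameter names illustrative:

```python
import numpy as np

def polarized_cbo_step(x, f, dt=0.01, lam=1.0, sigma=0.5,
                       alpha=100.0, kappa=1.0, rng=None):
    """One step of polarized CBO: per-particle consensus points, localized
    by a Gaussian kernel, so separate clusters can track different minima."""
    rng = rng or np.random.default_rng()
    fx = f(x)                                                    # (N,)
    gibbs = np.exp(-alpha * (fx - fx.min()))                     # Gibbs weights
    d2 = ((x[:, None, :] - x[None, :, :]) ** 2).sum(-1)          # pairwise squared distances
    k = np.exp(-d2 / (2 * kappa ** 2))                           # (N, N) localization kernel
    w = k * gibbs[None, :]                                       # per-particle weights
    m = (w[:, :, None] * x[None, :, :]).sum(1) / w.sum(1, keepdims=True)  # (N, d)
    noise = sigma * np.linalg.norm(x - m, axis=1, keepdims=True) \
            * np.sqrt(dt) * rng.standard_normal(x.shape)
    return x - lam * (x - m) * dt + noise
```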
In this paper we prove the first quantitative convergence rates for the graph infinity Laplace equation for length scales at the connectivity threshold. In the graph-based semi-supervised learning community this equation is also known as Lipschitz learning…
External link:
http://arxiv.org/abs/2210.09023
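For reference, the graph infinity Laplace equation referred to here is typically written as follows (notation follows the standard graph-based formulation and is an assumption, not a quote from the paper):

```latex
% Graph infinity Laplacian of u at a vertex x, with edge weights w(x,y):
\Delta_\infty u(x) = \max_{y}\, w(x,y)\bigl(u(y)-u(x)\bigr)
                   + \min_{y}\, w(x,y)\bigl(u(y)-u(x)\bigr),
```

and Lipschitz learning solves $\Delta_\infty u = 0$ at the unlabeled vertices while keeping $u$ fixed at the labeled ones.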
Published in:
IMA Journal of Numerical Analysis, 2022
Lipschitz learning is a graph-based semi-supervised learning method where one extends labels from a labeled to an unlabeled data set by solving the infinity Laplace equation on a weighted graph. In this work we prove uniform convergence rates for solutions…
External link:
http://arxiv.org/abs/2111.12370
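A minimal NumPy sketch of the label-extension step itself, via the classic fixed-point iteration on weighted extreme differences. The graph, labels, and convergence behaviour are all assumptions for illustration (the scheme below needs a connected graph and says nothing about the rates the paper proves):

```python
import numpy as np

def lipschitz_learning(W, labels, n_iter=2000):
    """Extend labels by iterating toward a solution of the graph
    infinity Laplace equation.

    W: (n, n) symmetric weight matrix; labels: dict {vertex: value}.
    """
    n = W.shape[0]
    u = np.zeros(n)
    for i, v in labels.items():
        u[i] = v
    free = [i for i in range(n) if i not in labels]
    for _ in range(n_iter):
        for i in free:
            nbrs = np.nonzero(W[i])[0]
            d = W[i, nbrs] * (u[nbrs] - u[i])
            a, b = nbrs[d.argmax()], nbrs[d.argmin()]
            wa, wb = W[i, a], W[i, b]
            # zero of wa*(u[a]-t) + wb*(u[b]-t): weighted midpoint of extremes
            u[i] = (wa * u[a] + wb * u[b]) / (wa + wb)
    return u
```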
We propose a novel strategy for Neural Architecture Search (NAS) based on Bregman iterations. Starting from a sparse neural network, our gradient-based one-shot algorithm gradually adds relevant parameters in an inverse scale space manner (see the Bregman iteration sketch after the next entry, which uses the same mechanism). This allows…
External link:
http://arxiv.org/abs/2106.02479
Published in:
Journal of Machine Learning Research, 23(192), 1-43, 2022
We propose a learning framework based on stochastic Bregman iterations, also known as mirror descent, to train sparse neural networks with an inverse scale space approach. We derive a baseline algorithm called LinBreg, an accelerated version using momentum…
External link:
http://arxiv.org/abs/2105.04319
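The snippet names LinBreg; a minimal NumPy sketch of a linearized Bregman iteration with an $\ell^1$-type sparsity functional, treating the parameters as one flat vector. Step sizes and names are illustrative, and the gradient oracle stands in for a stochastic minibatch gradient:

```python
import numpy as np

def shrink(v, lam):
    """Soft-thresholding: proximal map of lam * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def linbreg(grad_fn, dim, tau=0.1, lam=0.5, delta=1.0, n_iter=1000):
    """Linearized Bregman iteration (LinBreg-style baseline).

    grad_fn(theta) returns a (stochastic) gradient of the loss at theta.
    The dual variable v accumulates gradients; a parameter only activates
    once |v| exceeds lam -- the inverse scale space effect that lets the
    network start sparse and grow.
    """
    v = np.zeros(dim)                   # dual / subgradient variable
    theta = delta * shrink(v, lam)      # sparse (all-zero) initialization
    for _ in range(n_iter):
        v -= tau * grad_fn(theta)       # mirror-descent step in the dual
        theta = delta * shrink(v, lam)  # primal update via the prox of J
    return theta
```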
Published in:
International Conference on Scale Space and Variational Methods in Computer Vision, 307-319, 2021
Despite the large success of deep neural networks (DNN) in recent years, most neural networks still lack mathematical guarantees in terms of stability. For instance, DNNs are vulnerable to small or even imperceptible input perturbations, so-called adversarial examples…
External link:
http://arxiv.org/abs/2103.12531
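One common way to attach such a stability guarantee during training — and our reading of the direction this abstract points in, not the paper's verified method — is to penalize a sampled estimate of the network's Lipschitz constant. A hedged PyTorch sketch, with all names and constants hypothetical:

```python
import torch

def lipschitz_penalty(model, x, n_pairs=32, scale=0.1):
    """Monte-Carlo lower-bound estimate of the Lipschitz constant of `model`,
    usable as a regularizer: total_loss = task_loss + rho * penalty.

    Samples nearby input pairs and measures output change over input change.
    """
    idx = torch.randint(0, x.shape[0], (n_pairs,))
    a = x[idx]
    b = a + scale * torch.randn_like(a)           # perturbed second point
    num = (model(a) - model(b)).flatten(1).norm(dim=1)
    den = (a - b).flatten(1).norm(dim=1).clamp_min(1e-8)
    return (num / den).max()                      # largest observed ratio
```

In use, this would simply be added to the task loss, e.g. `loss = loss_fn(model(x), y) + rho * lipschitz_penalty(model, x)`, trading some accuracy for a smaller sensitivity to input perturbations.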