Showing 1 - 10 of 274 for search: '"Gribonval, Remi"'
Analyzing the behavior of ReLU neural networks often hinges on understanding the relationships between their parameters and the functions they implement. This paper proves a new bound on function distances in terms of the so-called path-metrics of th
External link:
http://arxiv.org/abs/2405.15006
Conservation laws are well-established in the context of Euclidean gradient flow dynamics, notably for linear or ReLU neural network training. Yet, their existence and principles for non-Euclidean geometries and momentum-based dynamics remain largely
External link:
http://arxiv.org/abs/2405.12888
Autor:
Belhadji, Ayoub, Gribonval, Rémi
Compressive learning is an emerging approach to drastically reduce the memory footprint of large-scale learning, by first summarizing a large dataset into a low-dimensional sketch vector, and then decoding from this sketch the latent information need
External link:
http://arxiv.org/abs/2312.09940
Autor:
Belhadji, Ayoub, Gribonval, Rémi
In the context of sketching for compressive mixture modeling, we revisit existing proofs of the Restricted Isometry Property of sketching operators with respect to certain mixtures models. After examining the shortcomings of existing guarantees, we p
External link:
http://arxiv.org/abs/2312.05573
We consider the problem of learning a graph modeling the statistical relations of the $d$ variables from a dataset with $n$ samples $X \in \mathbb{R}^{n \times d}$. Standard approaches amount to searching for a precision matrix $\Theta$ representativ
External link:
http://arxiv.org/abs/2311.04673
This work introduces the first toolkit around path-norms that fully encompasses general DAG ReLU networks with biases, skip connections and any operation based on the extraction of order statistics: max pooling, GroupSort etc. This toolkit notably al
External link:
http://arxiv.org/abs/2310.01225
Pairwise temporal interactions between entities can be represented as temporal networks, which code the propagation of processes such as epidemic spreading or information cascades, evolving on top of them. The largest outcome of these processes is di
External link:
http://arxiv.org/abs/2307.04890
Many matrices associated with fast transforms possess a certain low-rank property characterized by the existence of several block partitionings of the matrix, where each block is of low rank. Provided that these partitionings are known, there exist al
External link:
http://arxiv.org/abs/2307.00820
Understanding the geometric properties of gradient descent dynamics is a key ingredient in deciphering the recent success of very large machine learning models. A striking observation is that trained over-parameterized models retain some properties o
External link:
http://arxiv.org/abs/2307.00144
We study non-parametric density estimation for densities in Lipschitz and Sobolev spaces, and under central privacy. In particular, we investigate regimes where the privacy budget is not supposed to be constant. We consider the classical definition o
External link:
http://arxiv.org/abs/2306.14535