Showing 1 - 10 of 162 results for search: '"Marchetti, Giovanni"'
We study convolutional neural networks with monomial activation functions. Specifically, we prove that their parameterization map is regular and is an isomorphism almost everywhere, up to rescaling the filters. By leveraging tools from algebraic geometry, …
External link:
http://arxiv.org/abs/2410.00722
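The filter-rescaling symmetry mentioned above can be illustrated with a minimal NumPy sketch. The two-layer architecture, sizes, and degree d below are illustrative assumptions, not taken from the paper: scaling the filter by lam and the read-out weights by lam**(-d) leaves the network function unchanged.

```python
import numpy as np

def conv1d_valid(x, w):
    # 'valid' cross-correlation of a 1-D signal with a filter
    k = len(w)
    return np.array([x[i:i + k] @ w for i in range(len(x) - k + 1)])

def monomial_cnn(x, w1, w2, d=3):
    # one convolutional layer with monomial activation t -> t**d,
    # followed by a linear read-out
    h = conv1d_valid(x, w1) ** d
    return w2 @ h

rng = np.random.default_rng(0)
x  = rng.normal(size=16)
w1 = rng.normal(size=5)
w2 = rng.normal(size=16 - 5 + 1)
d, lam = 3, 2.5

y = monomial_cnn(x, w1, w2, d)
# rescaling the filter by lam and the read-out by lam**(-d)
# leaves the network function unchanged
y_rescaled = monomial_cnn(x, lam * w1, (lam ** -d) * w2, d)
print(np.allclose(y, y_rescaled))  # True
```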
Relative representations are an established approach to zero-shot model stitching, consisting of a non-trainable transformation of the latent space of a deep neural network. Based on insights of a topological and geometric nature, we propose two improvements …
External link:
http://arxiv.org/abs/2409.10967
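As background, the standard construction of relative representations -- cosine similarities of latents to a fixed set of anchors -- can be sketched as follows. The anchor choice and dimensions are arbitrary, and the paper's two improvements are not shown.

```python
import numpy as np

def relative_representation(z, anchors):
    """Non-trainable map: represent each latent vector by its cosine
    similarities to a fixed set of anchor latents."""
    z_n = z / np.linalg.norm(z, axis=-1, keepdims=True)
    a_n = anchors / np.linalg.norm(anchors, axis=-1, keepdims=True)
    return z_n @ a_n.T  # shape: (num_points, num_anchors)

rng = np.random.default_rng(0)
latents = rng.normal(size=(100, 64))   # latents from some encoder (hypothetical)
anchors = latents[:10]                 # anchor samples (hypothetical choice)
rel = relative_representation(latents, anchors)

# invariance to orthogonal reparameterizations of the latent space,
# which is what enables zero-shot stitching between independently trained encoders
Q, _ = np.linalg.qr(rng.normal(size=(64, 64)))
rel_rot = relative_representation(latents @ Q, anchors @ Q)
print(np.allclose(rel, rel_rot))  # True
```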
Author:
García-Castellanos, Alejandro, Medbouhi, Aniss Aiman, Marchetti, Giovanni Luca, Bekkers, Erik J., Kragic, Danica
We propose HyperSteiner -- an efficient heuristic algorithm for computing Steiner minimal trees in hyperbolic space. HyperSteiner extends the Euclidean Smith-Lee-Liebman algorithm, which is grounded in a divide-and-conquer approach …
External link:
http://arxiv.org/abs/2409.05671
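HyperSteiner itself is not reproduced here. As an orientation sketch only, the snippet below computes pairwise distances in the Poincaré disk model and a minimum spanning tree, whose length upper-bounds the Steiner minimal tree that HyperSteiner approximates (a Steiner tree may add extra branching points). The point set and SciPy usage are illustrative assumptions.

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree

def poincare_distance(u, v):
    """Geodesic distance between two points in the Poincare disk model."""
    du = 1.0 - np.sum(u ** 2)
    dv = 1.0 - np.sum(v ** 2)
    diff = np.sum((u - v) ** 2)
    return np.arccosh(1.0 + 2.0 * diff / (du * dv))

rng = np.random.default_rng(0)
pts = rng.uniform(-0.5, 0.5, size=(20, 2))  # points inside the unit disk

n = len(pts)
D = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        D[i, j] = poincare_distance(pts[i], pts[j])

# minimum spanning tree under hyperbolic distances; the Steiner minimal
# tree can only be shorter, since it may introduce extra Steiner points
mst = minimum_spanning_tree(D)
print("hyperbolic MST length:", mst.sum())
```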
We consider function spaces defined by self-attention networks without normalization and theoretically analyze their geometry. Since these networks are polynomial, we rely on tools from algebraic geometry. In particular, we study the identifiability …
External link:
http://arxiv.org/abs/2408.17221
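A minimal sketch of a single unnormalized self-attention layer, showing why such networks are polynomial (here a homogeneous cubic map in the input). Multi-head structure, residuals, and the paper's identifiability analysis are omitted; the sizes are arbitrary.

```python
import numpy as np

def unnormalized_self_attention(X, Wq, Wk, Wv):
    """Self-attention without softmax/normalization: every entry of the
    output is a degree-3 polynomial in the entries of X."""
    return (X @ Wq) @ (X @ Wk).T @ (X @ Wv)

rng = np.random.default_rng(0)
n, d = 4, 3
X = rng.normal(size=(n, d))
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))

Y = unnormalized_self_attention(X, Wq, Wk, Wv)

# cubic homogeneity: scaling the input by t scales the output by t**3,
# as expected of a degree-3 polynomial map
t = 1.7
print(np.allclose(unnormalized_self_attention(t * X, Wq, Wk, Wv), t ** 3 * Y))  # True
```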
Author:
Medbouhi, Aniss Aiman, Marchetti, Giovanni Luca, Polianskii, Vladislav, Kravberg, Alexander, Poklukar, Petra, Varava, Anastasia, Kragic, Danica
Hyperbolic machine learning is an emerging field aimed at representing data with a hierarchical structure. However, there is a lack of tools for evaluating and analyzing the resulting hyperbolic data representations. To this end, we propose Hyperbo…
External link:
http://arxiv.org/abs/2404.08608
In this work, we formally prove that, under certain conditions, if a neural network is invariant to a finite group, then its weights recover the Fourier transform on that group. This provides a mathematical explanation for the emergence of Fourier features …
External link:
http://arxiv.org/abs/2312.08550
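For the cyclic group Z/n, the Fourier transform on the group is the discrete Fourier transform, and shift-equivariant (circulant) linear maps are diagonal in that basis. The sketch below illustrates only this standard fact; it is not the paper's proof or its notion of weight recovery.

```python
import numpy as np

n = 8
# characters of the cyclic group Z/n: chi_k(g) = exp(-2*pi*i*k*g/n);
# stacking them gives the (unitary) discrete Fourier transform matrix
k, g = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
F = np.exp(-2j * np.pi * k * g / n) / np.sqrt(n)

# a shift-equivariant (circulant) operator is diagonalized by F,
# which is the sense in which Fourier features are adapted to Z/n-symmetry
rng = np.random.default_rng(0)
c = rng.normal(size=n)
C = np.stack([np.roll(c, s) for s in range(n)], axis=1)  # circulant matrix
D = F @ C @ F.conj().T
off_diag = D - np.diag(np.diag(D))
print(np.allclose(off_diag, 0))  # True: C is diagonal in the Fourier basis
```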
Lattice reduction is a combinatorial optimization problem aimed at finding the most orthogonal basis of a given lattice. In this work, we address lattice reduction via deep learning methods. We design a deep neural model outputting factorized unimodular matrices …
External link:
http://arxiv.org/abs/2311.08170
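As an illustration of the objective, the snippet below measures the orthogonality defect of a lattice basis and shows that multiplying by an integer matrix with determinant ±1 (unimodular) changes the basis without changing the lattice it generates; the deep model that outputs such factorized matrices is not sketched, and the example matrices are hypothetical.

```python
import numpy as np

def orthogonality_defect(B):
    """Product of basis-vector norms divided by the lattice volume;
    equals 1 iff the basis is orthogonal (lower is better)."""
    norms = np.linalg.norm(B, axis=1)
    vol = np.abs(np.linalg.det(B))
    return np.prod(norms) / vol

rng = np.random.default_rng(0)
B = rng.normal(size=(3, 3))          # rows are basis vectors of a lattice

# an integer matrix with determinant +/-1 (unimodular) changes the basis
# but not the lattice it generates -- this is the kind of transformation
# a factorized output would parameterize
U = np.array([[1, 2, 0],
              [0, 1, 0],
              [1, 3, 1]])
assert round(np.linalg.det(U)) in (-1, 1)

B_new = U @ B
print(orthogonality_defect(B), orthogonality_defect(B_new))
# the lattice volume is unchanged by a unimodular change of basis
print(np.isclose(abs(np.linalg.det(B)), abs(np.linalg.det(B_new))))  # True
```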
We address the problem of learning representations from observations of a scene involving an agent and an external object the agent interacts with. To this end, we propose a representation learning framework extracting the location in physical space …
External link:
http://arxiv.org/abs/2309.05346
Author:
Rey, Luis Armando Pérez, Marchetti, Giovanni Luca, Kragic, Danica, Jarnikov, Dmitri, Holenderski, Mike
We introduce Equivariant Isomorphic Networks (EquIN) -- a method for learning representations that are equivariant with respect to general group actions over data. Unlike existing equivariant representation learners, EquIN is suitable for group actions …
External link:
http://arxiv.org/abs/2301.05231
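As a reminder of what equivariance with respect to a group action means, the sketch below checks f(g·x) = g·f(x) for circular shifts acting on vectors, with circular convolution as f. This shows only the defining property for a cyclic group, not the EquIN method.

```python
import numpy as np

def circular_conv(x, w):
    """A shift-equivariant map: circular convolution with a fixed filter."""
    n = len(x)
    return np.array([sum(x[(i - j) % n] * w[j] for j in range(len(w)))
                     for i in range(n)])

def shift(x, g):
    """Action of the cyclic group Z/n on vectors by circular shift."""
    return np.roll(x, g)

rng = np.random.default_rng(0)
x, w, g = rng.normal(size=8), rng.normal(size=3), 3

# equivariance: acting on the input and then mapping equals
# mapping and then acting on the output
print(np.allclose(circular_conv(shift(x, g), w),
                  shift(circular_conv(x, w), g)))  # True
```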
Author:
Marchetti, Giovanni Luca, Polianskii, Vladislav, Varava, Anastasiia, Pokorny, Florian T., Kragic, Danica
We introduce a non-parametric density estimator dubbed the Radial Voronoi Density Estimator (RVDE). RVDE is grounded in the geometry of Voronoi tessellations and as such benefits from local geometric adaptiveness and broad convergence properties. Due to …
External link:
http://arxiv.org/abs/2210.03964
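RVDE itself is radial and defined in arbitrary dimension; as a toy illustration of Voronoi-based density estimation only, here is a 1-D estimator that assigns each sample the interval between the midpoints to its neighbours and sets the density on that cell to 1/(n · cell length). The function name and all choices below are illustrative, not the paper's.

```python
import numpy as np

def voronoi_density_1d(samples, queries):
    """Toy 1-D Voronoi density estimate: each sample owns the interval
    between the midpoints to its neighbours, and the density on that
    cell is 1 / (n * cell_length)."""
    s = np.sort(samples)
    mids = (s[:-1] + s[1:]) / 2.0
    # clip the two unbounded boundary cells to a finite support
    edges = np.concatenate(([s[0] - (mids[0] - s[0])], mids,
                            [s[-1] + (s[-1] - mids[-1])]))
    lengths = np.diff(edges)
    cell = np.clip(np.searchsorted(edges, queries) - 1, 0, len(s) - 1)
    return 1.0 / (len(s) * lengths[cell])

rng = np.random.default_rng(0)
samples = rng.normal(size=500)
queries = np.linspace(-3, 3, 7)
print(voronoi_density_1d(samples, queries))
```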