Showing 1 - 10 of 78 for search: '"Bekkers, Erik J"'
Author:
García-Castellanos, Alejandro, Medbouhi, Aniss Aiman, Marchetti, Giovanni Luca, Bekkers, Erik J., Kragic, Danica
We propose HyperSteiner -- an efficient heuristic algorithm for computing Steiner minimal trees in hyperbolic space. HyperSteiner extends the Euclidean Smith-Lee-Liebman algorithm, which is grounded in a divide-and-conquer approach involving the …
External link:
http://arxiv.org/abs/2409.05671
Author:
Ranum, Oline, Wessels, David R., Otterspeer, Gomer, Bekkers, Erik J., Roelofsen, Floris, Andersen, Jari I.
Sign Language Processing (SLP) provides a foundation for a more inclusive future in language technology; however, the field faces several significant challenges that must be addressed to achieve practical, real-world applications. This work addresses …
External link:
http://arxiv.org/abs/2409.15284
We show that the gradient of the cosine similarity between two points goes to zero in two under-explored settings: (1) if a point has large magnitude or (2) if the points are on opposite ends of the latent space. Counterintuitively, we prove that opt…
External link:
http://arxiv.org/abs/2406.16468
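The first claim in this abstract is easy to check numerically (this sketch is not from the paper itself): the analytic gradient of cosine similarity with respect to one point scales inversely with that point's magnitude, so scaling a point by 100 shrinks the gradient norm by exactly that factor.

```python
import numpy as np

def cos_sim_grad(a, b):
    """Analytic gradient of cos(a, b) with respect to a."""
    na, nb = np.linalg.norm(a), np.linalg.norm(b)
    return b / (na * nb) - (a @ b) * a / (na**3 * nb)

rng = np.random.default_rng(0)
a = rng.normal(size=8)
b = rng.normal(size=8)

g_small = np.linalg.norm(cos_sim_grad(a, b))
g_large = np.linalg.norm(cos_sim_grad(100.0 * a, b))

# Scaling a by 100 shrinks the gradient norm by the same factor:
print(g_large / g_small)  # ≈ 0.01
```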
Author:
Knigge, David M., Wessels, David R., Valperga, Riccardo, Papa, Samuele, Sonke, Jan-Jakob, Gavves, Efstratios, Bekkers, Erik J.
Recently, Conditional Neural Fields (NeFs) have emerged as a powerful modelling paradigm for PDEs, by learning solutions as flows in the latent space of the Conditional NeF. Although benefiting from favourable properties of NeFs such as grid-agnostic …
External link:
http://arxiv.org/abs/2406.06660
Author:
Wessels, David R, Knigge, David M, Papa, Samuele, Valperga, Riccardo, Vadgama, Sharvaree, Gavves, Efstratios, Bekkers, Erik J
Conditional Neural Fields (CNFs) are increasingly being leveraged as continuous signal representations, by associating each data-sample with a latent variable that conditions a shared backbone Neural Field (NeF) to reconstruct the sample. However, ex…
External link:
http://arxiv.org/abs/2406.05753
This paper introduces E(n) Equivariant Message Passing Cellular Networks (EMPCNs), an extension of E(n) Equivariant Graph Neural Networks to CW-complexes. Our approach addresses two aspects of geometric message passing networks: 1) enhancing their ex…
External link:
http://arxiv.org/abs/2406.03145
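The core E(n) idea behind this line of work can be illustrated with a toy check (outside the paper's own code): messages built only from pairwise distances are unchanged when the input points are rotated, reflected, or translated.

```python
import numpy as np

def messages(x):
    """Toy distance-based messages: invariant to E(n) transformations."""
    d = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
    return np.tanh(d)  # stand-in for a learned message function

rng = np.random.default_rng(1)
x = rng.normal(size=(5, 3))

# A random orthogonal transform (via QR) and a random translation
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
t = rng.normal(size=3)

m1 = messages(x)
m2 = messages(x @ Q.T + t)
print(np.allclose(m1, m2))  # True
```

Real EMPCNs/EGNNs additionally update coordinates equivariantly; this sketch only demonstrates the invariant-message ingredient.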
Systems of interacting objects often evolve under the influence of field effects that govern their dynamics, yet previous works have abstracted away from such effects, and assume that systems evolve in a vacuum. In this work, we focus on discovering …
External link:
http://arxiv.org/abs/2310.20679
Author:
Bekkers, Erik J, Vadgama, Sharvaree, Hesselink, Rob D, van der Linden, Putri A, Romero, David W
Based on the theory of homogeneous spaces we derive geometrically optimal edge attributes to be used within the flexible message-passing framework. We formalize the notion of weight sharing in convolutional networks as the sharing of message function…
External link:
http://arxiv.org/abs/2310.02970
In this paper, we investigate properties and limitations of invariance learned by neural networks from the data compared to the genuine invariance achieved through invariant weight-tying. To do so, we adopt a group theoretical perspective and analyze …
External link:
http://arxiv.org/abs/2308.03904
Neural operations that rely on neighborhood information are much more expensive when deployed on point clouds than on grid data due to the irregular distances between points in a point cloud. In a grid, on the other hand, we can compute the kernel on…
External link:
http://arxiv.org/abs/2307.14354
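The cost gap this abstract refers to can be made concrete with a small count (an illustrative sketch, not the paper's method): on a grid, relative offsets between points take only a handful of distinct values, so a kernel can be evaluated once per offset and reused; in a random point cloud of the same size, essentially every pairwise offset is distinct.

```python
import numpy as np

def unique_offsets(points, decimals=6):
    """Count distinct pairwise offset vectors among the points."""
    diffs = points[:, None, :] - points[None, :, :]
    flat = np.round(diffs.reshape(-1, points.shape[1]), decimals)
    return len(np.unique(flat, axis=0))

# 8x8 grid: offsets repeat heavily (15 values per axis -> 225 total)
g = np.stack(np.meshgrid(np.arange(8), np.arange(8)), -1)
g = g.reshape(-1, 2).astype(float)

# Random point cloud of the same size: nearly all 64*64 offsets differ
rng = np.random.default_rng(2)
p = rng.uniform(size=(64, 2))

print(unique_offsets(g), unique_offsets(p))  # 225 vs ~4000
```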