Showing 1 - 10
of 128
for search: '"P. P. Saratchandran"'
This paper tackles the simultaneous optimization of pose and Neural Radiance Fields (NeRF). Departing from the conventional practice of using explicit global representations for camera pose, we propose a novel overparameterized representation that mo…
External link:
http://arxiv.org/abs/2407.12354
Low-rank decomposition has emerged as a vital tool for enhancing parameter efficiency in neural network architectures, gaining traction across diverse applications in machine learning. These techniques significantly lower the number of parameters, st…
External link:
http://arxiv.org/abs/2403.19243
Deep implicit functions have been found to be an effective tool for efficiently encoding all manner of natural signals. Their attractiveness stems from their ability to compactly represent signals with little to no offline training data. Instead, the…
External link:
http://arxiv.org/abs/2403.19163
Implicit neural representations have emerged as a powerful technique for encoding complex continuous multidimensional signals as neural networks, enabling a wide range of applications in computer vision, robotics, and geometry. While Adam is commonly…
External link:
http://arxiv.org/abs/2402.08784
Modelling dynamical systems is an integral component for understanding the natural world. To this end, neural networks are becoming an increasingly popular candidate owing to their ability to learn complex functions from large amounts of data. Despit…
External link:
http://arxiv.org/abs/2303.05728
Author:
Hochs, Peter, Saratchandran, Hemanth
For proper group actions on smooth manifolds, with compact quotients, we define an equivariant version of the Ruelle dynamical $\zeta$-function for equivariant flows satisfying a nondegeneracy condition. The construction is guided by an equivariant g…
External link:
http://arxiv.org/abs/2303.00312
We introduce a general theoretical framework, designed for the study of gradient optimisation of deep neural networks, that encompasses ubiquitous architecture choices including batch normalisation, weight normalisation and skip connections. Our fram…
External link:
http://arxiv.org/abs/2210.05371
Author:
Ramasinghe, Sameera, MacDonald, Lachlan, Farazi, Moshiur, Saratchandran, Hemanth, Lucey, Simon
Characterizing the remarkable generalization properties of over-parameterized neural networks remains an open problem. In this paper, we promote a shift of focus towards initialization rather than neural architecture or (stochastic) gradient descent…
External link:
http://arxiv.org/abs/2206.08558
Author:
Hochs, Peter, Saratchandran, Hemanth
We construct an equivariant version of Ray-Singer analytic torsion for proper, isometric actions by locally compact groups on Riemannian manifolds, with compact quotients. We obtain results on convergence, metric independence, vanishing for even-dime…
External link:
http://arxiv.org/abs/2205.04117
Let $\mathcal{E}$ be a Hermitian vector bundle over a Riemannian manifold $M$ with metric $g$, and let $\nabla$ be a metric covariant derivative on $\mathcal{E}$. We study the generalized Ornstein-Uhlenbeck differential expression $P^{\nabla}=\nabla^{\dagger}\dots$…
External link:
http://arxiv.org/abs/2107.03301