Showing 1 - 10 of 147 for the search: '"Hirn, Matthew"'
Motivated by modern data applications such as cryo-electron microscopy, the goal of classic multi-reference alignment (MRA) is to recover an unknown signal $f: \mathbb{R} \to \mathbb{R}$ from many observations that have been randomly translated and …
External link:
http://arxiv.org/abs/2402.14276
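The MRA observation model summarized above — random circular translations of an unknown signal, plus noise — is easy to simulate. The sketch below pairs it with a naive align-to-reference baseline; the helper names and the cross-correlation alignment are illustrative, not the paper's method, and this baseline is known to degrade at low signal-to-noise ratios:

```python
import numpy as np

def mra_observations(f, m, sigma, rng):
    # Each observation is a random circular shift of f plus Gaussian noise.
    n = len(f)
    shifts = rng.integers(0, n, size=m)
    return np.stack([np.roll(f, s) for s in shifts]) + sigma * rng.normal(size=(m, n))

def align_and_average(Y):
    # Naive baseline: align every observation to the first one by
    # maximizing circular cross-correlation (computed via the FFT),
    # then average the aligned copies.
    ref = Y[0]
    aligned = []
    for y in Y:
        corr = np.fft.ifft(np.fft.fft(ref) * np.conj(np.fft.fft(y))).real
        aligned.append(np.roll(y, int(np.argmax(corr))))
    return np.mean(aligned, axis=0)

rng = np.random.default_rng(1)
f = np.sin(np.linspace(0, 2 * np.pi, 64, endpoint=False))
Y = mra_observations(f, 200, 0.1, rng)
f_hat = align_and_average(Y)  # estimates f up to a global circular shift
```

Because the reference observation is itself randomly shifted, the estimate is only defined up to a global translation, which is the usual identifiability caveat in MRA.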
For deep learning problems on graph-structured data, pooling layers are important for downsampling, reducing computational cost, and minimizing overfitting. We define a pooling layer, NervePool, for data structured as simplicial complexes, which are …
External link:
http://arxiv.org/abs/2305.06315
In this paper, we generalize finite depth wavelet scattering transforms, which we formulate as $\mathbf{L}^q(\mathbb{R}^n)$ norms of a cascade of continuous wavelet transforms (or dyadic wavelet transforms) and contractive nonlinearities. We then provide …
External link:
http://arxiv.org/abs/2209.05038
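The construction summarized above — a cascade of wavelet transforms interleaved with a contractive (modulus) nonlinearity, with a norm recorded for every path — can be sketched in a few lines. The difference-of-Gaussians filter bank below is a stand-in for a true dyadic wavelet family, and all names are illustrative, not the paper's:

```python
import numpy as np

def dyadic_filters(n, num_scales):
    # Band-pass filters in the Fourier domain, built as differences of
    # Gaussians at dyadic scales (a crude stand-in for dyadic wavelets;
    # each difference vanishes at the zero frequency).
    freqs = np.fft.fftfreq(n)
    g = [np.exp(-(freqs * 2 ** j) ** 2) for j in range(num_scales + 1)]
    return [g[j] - g[j + 1] for j in range(num_scales)]

def scattering_norms(x, depth=2, num_scales=3, q=2):
    # Cascade: wavelet transform -> modulus (a contractive nonlinearity)
    # -> wavelet transform -> ...; record the discrete l^q norm of every
    # path through the cascade.
    filters = dyadic_filters(len(x), num_scales)
    layers, norms = [x.astype(float)], []
    for _ in range(depth):
        next_layer = []
        for u in layers:
            U = np.fft.fft(u)
            for f in filters:
                v = np.abs(np.fft.ifft(U * f))  # |W u|, contractive
                norms.append(np.linalg.norm(v, ord=q))
                next_layer.append(v)
        layers = next_layer
    return np.array(norms)

sig = np.sin(np.linspace(0, 8 * np.pi, 256))
coeffs = scattering_norms(sig)  # 3 first-order + 9 second-order paths
```

With `depth=2` and `num_scales=3` the cascade has 3 + 3² = 12 paths, so `coeffs` has 12 nonnegative entries.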
Author:
Chew, Joyce, Hirn, Matthew, Krishnaswamy, Smita, Needell, Deanna, Perlmutter, Michael, Steach, Holly, Viswanath, Siddharth, Wu, Hau-Tieng
The scattering transform is a multilayered, wavelet-based transform initially introduced as a model of convolutional neural networks (CNNs) that has played a foundational role in our understanding of these networks' stability and invariance properties …
External link:
http://arxiv.org/abs/2208.08561
Author:
Chew, Joyce, Steach, Holly R., Viswanath, Siddharth, Wu, Hau-Tieng, Hirn, Matthew, Needell, Deanna, Krishnaswamy, Smita, Perlmutter, Michael
The manifold scattering transform is a deep feature extractor for data defined on a Riemannian manifold. It is one of the first examples of extending convolutional neural network-like operators to general manifolds. The initial work on this model focused …
External link:
http://arxiv.org/abs/2206.10078
Author:
Liu, Renming, Cantürk, Semih, Wenkel, Frederik, McGuire, Sarah, Wang, Xinyi, Little, Anna, O'Bray, Leslie, Perlmutter, Michael, Rieck, Bastian, Hirn, Matthew, Wolf, Guy, Rampášek, Ladislav
Graph Neural Networks (GNNs) extend the success of neural networks to graph-structured data by accounting for their intrinsic geometry. While extensive research has been done on developing GNN models with superior performance according to a collection …
External link:
http://arxiv.org/abs/2206.07729
Author:
Huguet, Guillaume, Tong, Alexander, Rieck, Bastian, Huang, Jessie, Kuchroo, Manik, Hirn, Matthew, Wolf, Guy, Krishnaswamy, Smita
Diffusion condensation is a dynamic process that yields a sequence of multiscale data representations that aim to encode meaningful abstractions. It has proven effective for manifold learning, denoising, clustering, and visualization of high-dimensional …
External link:
http://arxiv.org/abs/2203.14860
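As a rough illustration of the diffusion-condensation idea described above — repeatedly applying a data-driven diffusion operator so that points drift toward local weighted averages, yielding a multiscale sequence of representations — here is a minimal fixed-bandwidth sketch. The actual algorithm additionally shrinks the kernel bandwidth over time and merges condensed points; everything below is an illustrative toy, not the paper's procedure:

```python
import numpy as np

def diffusion_condensation(X, steps=10, eps=0.5):
    # At each step: build a Gaussian affinity matrix, row-normalize it
    # into a Markov diffusion operator P, and move every point to the
    # P-weighted average of its neighbors. Nearby points condense while
    # well-separated clusters stay apart.
    Xs = [X.copy()]
    for _ in range(steps):
        D = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
        K = np.exp(-D / eps)
        P = K / K.sum(axis=1, keepdims=True)
        X = P @ X
        Xs.append(X.copy())
    return Xs  # the multiscale sequence of representations

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.1, (20, 2)),   # cluster near the origin
               rng.normal(3.0, 0.1, (20, 2))])  # cluster near (3, 3)
seq = diffusion_condensation(X)
```

Each element of `seq` is one scale of the representation: early iterates resemble the raw data, while later iterates collapse each tight cluster toward a single point.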
Geometric deep learning has made great strides towards generalizing the design of structure-aware neural networks from traditional domains to non-Euclidean ones, giving rise to graph neural networks (GNN) that can be applied to graph-structured data …
External link:
http://arxiv.org/abs/2201.08932
Author:
Liu, Renming, Cantürk, Semih, Wenkel, Frederik, Sandfelder, Dylan, Kreuzer, Devin, Little, Anna, McGuire, Sarah, O'Bray, Leslie, Perlmutter, Michael, Rieck, Bastian, Hirn, Matthew, Wolf, Guy, Rampášek, Ladislav
Graph neural networks (GNNs) have attracted much attention due to their ability to leverage the intrinsic geometries of the underlying data. Although many different types of GNN models have been developed, with many benchmarking procedures to demonstrate …
External link:
http://arxiv.org/abs/2110.14809