Showing 1 - 10 of 110 for search: '"Perlmutter, Michael A."'
Author:
Viswanath, Siddharth, Bhaskar, Dhananjay, Johnson, David R., Rocha, Joao Felipe, Castro, Egbert, Grady, Jackson D., Grigas, Alex T., Perlmutter, Michael A., O'Hern, Corey S., Krishnaswamy, Smita
Understanding the dynamic nature of protein structures is essential for comprehending their biological functions. While significant progress has been made in predicting static folded structures, modeling protein motions on microsecond to millisecond…
External link:
http://arxiv.org/abs/2410.20317
Author:
Johnson, David R., Chew, Joyce, Viswanath, Siddharth, De Brouwer, Edward, Needell, Deanna, Krishnaswamy, Smita, Perlmutter, Michael
In order to better understand manifold neural networks (MNNs), we introduce Manifold Filter-Combine Networks (MFCNs). The filter-combine framework parallels the popular aggregate-combine paradigm for graph neural networks (GNNs) and naturally suggests…
External link:
http://arxiv.org/abs/2410.14639
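As context for the filter-combine framework mentioned in this record, here is a minimal sketch of the aggregate-combine paradigm for GNNs that it parallels. The mean aggregator, single linear combine step, and ReLU are illustrative assumptions, not the MFCN construction from the paper.

```python
# Minimal sketch of the aggregate-combine GNN paradigm (illustrative choices,
# not the paper's MFCN construction).
import numpy as np

def aggregate_combine_layer(A, X, W_self, W_neigh):
    """One GNN layer: aggregate neighbor features, then combine with self features.

    A: (n, n) adjacency matrix, X: (n, d) node features,
    W_self, W_neigh: (d, d_out) weight matrices (random here).
    """
    deg = np.maximum(A.sum(axis=1, keepdims=True), 1)         # avoid division by zero
    neigh_mean = (A @ X) / deg                                 # aggregate: mean over neighbors
    return np.maximum(X @ W_self + neigh_mean @ W_neigh, 0)   # combine + ReLU

# Toy example: a 4-node path graph with 3-dimensional features.
rng = np.random.default_rng(0)
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = rng.standard_normal((4, 3))
H = aggregate_combine_layer(A, X, rng.standard_normal((3, 2)), rng.standard_normal((3, 2)))
print(H.shape)  # (4, 2)
```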
Author:
Sun, Xingzhi, Xu, Charles, Rocha, João F., Liu, Chen, Hollander-Bodie, Benjamin, Goldman, Laney, DiStasio, Marcello, Perlmutter, Michael, Krishnaswamy, Smita
In many data-driven applications, higher-order relationships among multiple objects are essential in capturing complex interactions. Hypergraphs, which generalize graphs by allowing edges to connect any number of nodes, provide a flexible and powerful…
External link:
http://arxiv.org/abs/2409.09469
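The point that a hyperedge may connect any number of nodes can be made concrete with a node-by-hyperedge incidence matrix, a standard hypergraph encoding. The toy nodes, hyperedges, and clique expansion below are illustrative and are not claimed to be the paper's construction.

```python
# A hypergraph lets one "edge" connect any number of nodes. A standard encoding
# is a node-by-hyperedge incidence matrix H with H[v, e] = 1 when node v
# belongs to hyperedge e (illustrative example only).
import numpy as np

nodes = ["A", "B", "C", "D", "E"]
hyperedges = [{"A", "B", "C"},      # a 3-node interaction
              {"C", "D"},           # an ordinary pairwise edge
              {"B", "D", "E"}]      # another higher-order interaction

H = np.zeros((len(nodes), len(hyperedges)))
for e, members in enumerate(hyperedges):
    for v in members:
        H[nodes.index(v), e] = 1

# Clique expansion: two nodes are adjacent if they share at least one hyperedge.
A = (H @ H.T > 0).astype(int)
np.fill_diagonal(A, 0)
print(A)
```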
Graph neural networks (GNNs) have achieved great success for a variety of tasks such as node classification, graph classification, and link prediction. However, the use of GNNs (and machine learning more generally) to solve combinatorial optimization…
External link:
http://arxiv.org/abs/2405.20543
Here we consider the problem of denoising features associated to complex data, modeled as signals on a graph, via a smoothness prior. This is motivated in part by settings such as single-cell RNA where the data is very high-dimensional, but its structure…
External link:
http://arxiv.org/abs/2311.16378
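A classical instance of denoising graph signals with a smoothness prior is Laplacian (Tikhonov) regularization: minimize ||x - y||^2 + lam * x^T L x, which has the closed-form solution x* = (I + lam*L)^{-1} y. The sketch below shows this textbook baseline; it is not presented as the paper's actual method, and the cycle-graph example and lam value are arbitrary.

```python
# Textbook graph-signal denoising with a smoothness prior:
# x* = argmin ||x - y||^2 + lam * x^T L x  =  (I + lam*L)^{-1} y,
# where L is the combinatorial graph Laplacian. Illustrative only.
import numpy as np

def laplacian_denoise(A, y, lam=1.0):
    """Denoise noisy node features y (n, d) on a graph with adjacency A (n, n)."""
    L = np.diag(A.sum(axis=1)) - A                      # graph Laplacian
    return np.linalg.solve(np.eye(len(A)) + lam * L, y)

# Toy example: a smooth signal on a 6-node cycle graph plus noise.
n = 6
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1
rng = np.random.default_rng(0)
clean = np.sin(2 * np.pi * np.arange(n) / n)[:, None]
noisy = clean + 0.3 * rng.standard_normal((n, 1))
denoised = laplacian_denoise(A, noisy, lam=2.0)
print(np.linalg.norm(noisy - clean), np.linalg.norm(denoised - clean))
```

Larger values of lam enforce more smoothness across edges, at the cost of pulling the estimate further from the observations.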
Author:
Xu, Charles, Goldman, Laney, Guo, Valentina, Hollander-Bodie, Benjamin, Trank-Greene, Maedee, Adelstein, Ian, De Brouwer, Edward, Ying, Rex, Krishnaswamy, Smita, Perlmutter, Michael
Published in:
Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, PMLR 238:4537-4545, 2024
Graph neural networks (GNNs) have emerged as a powerful tool for tasks such as node classification and graph classification. However, much less work has been done on signal classification, where the data consists of many functions (referred to as signals)…
External link:
http://arxiv.org/abs/2310.17579
Author:
Bhaskar, Dhananjay, Zhang, Yanlei, Xu, Charles, Sun, Xingzhi, Fasina, Oluwadamilola, Wolf, Guy, Nickel, Maximilian, Perlmutter, Michael, Krishnaswamy, Smita
In this paper we introduce DYMAG: a message passing paradigm for GNNs built on the expressive power of continuous, multiscale graph-dynamics. Standard discrete-time message passing algorithms implicitly make use of simplistic graph dynamics and aggregation…
External link:
http://arxiv.org/abs/2309.09924
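To illustrate what continuous, multiscale graph dynamics can look like, the sketch below runs heat diffusion dx/dt = -Lx on a small graph and reads off the solution x(t) = exp(-tL) x(0) at several time scales. Heat dynamics is only one familiar example of such a process; this is not DYMAG's message-passing scheme, and the graph, signal, and time scales are made up for illustration.

```python
# Continuous, multiscale graph dynamics, illustrated with heat diffusion:
# dx/dt = -L x, solved at several time scales t via x(t) = expm(-t L) x(0).
# Not DYMAG itself; just a familiar example of continuous graph dynamics.
import numpy as np
from scipy.linalg import expm

def heat_dynamics_features(A, X, times=(0.5, 1.0, 2.0, 4.0)):
    """Return node features diffused over the graph at multiple time scales."""
    L = np.diag(A.sum(axis=1)) - A               # combinatorial graph Laplacian
    return [expm(-t * L) @ X for t in times]     # one diffused copy per scale

# Toy example: diffuse a one-hot signal on a 5-node path graph.
n = 5
A = np.zeros((n, n))
for i in range(n - 1):
    A[i, i + 1] = A[i + 1, i] = 1
X = np.eye(n)[:, [0]]                            # unit mass on the first node
for t, Xt in zip((0.5, 1.0, 2.0, 4.0), heat_dynamics_features(A, X)):
    print(t, np.round(Xt.ravel(), 3))
```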
Author:
Venkat, Aarthi, Chew, Joyce, Rodriguez, Ferran Cardoso, Tape, Christopher J., Perlmutter, Michael, Krishnaswamy, Smita
Directed graphs are a natural model for many phenomena, in particular scientific knowledge graphs such as molecular interaction or chemical reaction networks that define cellular signaling relationships. In these situations, source nodes typically have…
External link:
http://arxiv.org/abs/2309.07813
Author:
MacDonald, Kincaid, Bhaskar, Dhananjay, Thampakkul, Guy, Nguyen, Nhi, Zhang, Joia, Perlmutter, Michael, Adelstein, Ian, Krishnaswamy, Smita
We consider the problem of embedding point cloud data sampled from an underlying manifold with an associated flow or velocity. Such data arises in many contexts where static snapshots of dynamic entities are measured, including in high-throughput bio…
External link:
http://arxiv.org/abs/2308.00176
We introduce a class of manifold neural networks (MNNs) that we call Manifold Filter-Combine Networks (MFCNs), which aims to further our understanding of MNNs, analogous to how the aggregate-combine framework helps with the understanding of graph neural networks…
External link:
http://arxiv.org/abs/2307.04056
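For readers unfamiliar with manifold filters, a common spectral form in the MNN literature expands a signal in the eigenbasis of the Laplace-Beltrami operator and reweights each mode; the notation below (w, \lambda_k, \varphi_k) is illustrative rather than quoted from the paper.

```latex
% A common spectral form for a "manifold filter": expand f in the
% Laplace--Beltrami eigenbasis (\lambda_k, \varphi_k) and reweight each
% mode by w(\lambda_k). Notation is illustrative, not quoted from the paper.
\[
  (W f)(x) \;=\; \sum_{k \ge 0} w(\lambda_k)\,
  \langle f, \varphi_k \rangle_{L^2(\mathcal{M})}\, \varphi_k(x)
\]
```

A filter-combine layer would then apply several such filters to each input channel and combine the filtered channels (e.g. by a learned linear map and a pointwise nonlinearity), roughly mirroring the aggregate and combine steps in GNNs.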