Showing 1 - 10 of 14
for search: '"Toenshoff, Jan"'
Published in:
The Twelfth International Conference on Learning Representations (2024)
Graph Transformers (GTs) such as SAN and GPS are graph processing models that combine Message-Passing GNNs (MPGNNs) with global Self-Attention. They were shown to be universal function approximators, with two reservations: 1. The initial node features…
External link:
http://arxiv.org/abs/2405.11951
Published in:
CIKM 2023
Machinery for data analysis often requires a numeric representation of the input. Towards that, a common practice is to embed components of structured data into a high-dimensional vector space. We study the embedding of the tuples of a relational database…
External link:
http://arxiv.org/abs/2401.11215
The recent Long-Range Graph Benchmark (LRGB, Dwivedi et al. 2022) introduced a set of graph learning tasks strongly dependent on long-range interaction between vertices. Empirical evidence suggests that on these tasks Graph Transformers significantly…
External link:
http://arxiv.org/abs/2309.00367
Transformers have become the primary architecture for natural language processing. In this study, we explore their use for auto-regressive density estimation in high-energy jet physics, which involves working with a high-dimensional space. We draw an…
External link:
http://arxiv.org/abs/2303.07364
The expressivity of Graph Neural Networks (GNNs) depends on the aggregation functions they employ. Theoretical works have pointed towards Sum aggregation GNNs subsuming every other GNN, while certain practical works have observed a clear advantage…
External link:
http://arxiv.org/abs/2302.11603
Recently, many works studied the expressive power of graph neural networks (GNNs) by linking it to the $1$-dimensional Weisfeiler--Leman algorithm ($1\text{-}\mathsf{WL}$). Here, the $1\text{-}\mathsf{WL}$ is a well-studied heuristic for the graph isomorphism…
External link:
http://arxiv.org/abs/2301.11039
We propose a universal Graph Neural Network architecture which can be trained as an end-to-end search heuristic for any Constraint Satisfaction Problem (CSP). Our architecture can be trained unsupervised with policy gradient descent to generate problem…
External link:
http://arxiv.org/abs/2208.10227
We study the problem of computing an embedding of the tuples of a relational database in a manner that is extensible to dynamic changes of the database. In this problem, the embedding should be stable in the sense that it should not change on the existing…
External link:
http://arxiv.org/abs/2103.06766
We propose CRaWl, a novel neural network architecture for graph learning. Like graph neural networks, CRaWl layers update node features on a graph and thus can freely be combined or interleaved with GNN layers. Yet CRaWl operates fundamentally differently…
External link:
http://arxiv.org/abs/2102.08786
Many combinatorial optimization problems can be phrased in the language of constraint satisfaction problems. We introduce a graph neural network architecture for solving such optimization problems. The architecture is generic; it works for all binary…
External link:
http://arxiv.org/abs/1909.08387