Showing 1 - 10 of 35 results for search: '"Banerjee, Pradeep Kr."'
Graph neural networks (GNNs) are able to leverage the structure of graph data by passing messages along the edges of the graph. While this allows GNNs to learn features depending on the graph structure, for certain graph topologies it leads to inefficient …
External link:
http://arxiv.org/abs/2210.11790
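For orientation, a minimal sketch of the message-passing step this abstract refers to; the graph, feature dimensions and weights below are illustrative and not taken from the paper:

    # One round of message passing with mean aggregation over neighbours,
    # followed by a shared linear map and a ReLU nonlinearity.
    import numpy as np

    def message_passing_layer(adj, features, weight):
        deg = adj.sum(axis=1, keepdims=True)         # node degrees
        deg[deg == 0] = 1.0                          # isolated nodes: avoid division by zero
        aggregated = (adj @ features) / deg          # mean over neighbours
        return np.maximum(aggregated @ weight, 0.0)  # linear map + ReLU

    # Toy 4-node path graph 0-1-2-3 with 2-dimensional node features.
    adj = np.array([[0, 1, 0, 0],
                    [1, 0, 1, 0],
                    [0, 1, 0, 1],
                    [0, 0, 1, 0]], dtype=float)
    rng = np.random.default_rng(0)
    features = rng.normal(size=(4, 2))
    weight = rng.normal(size=(2, 2))
    print(message_passing_layer(adj, features, weight))

Stacking such layers lets features propagate along edges; the topology of the graph then determines how well information from distant nodes can be combined.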
The quality of signal propagation in message-passing graph neural networks (GNNs) strongly influences their expressivity, as has been observed in recent works. In particular, for prediction tasks relying on long-range interactions, recursive aggregation …
External link:
http://arxiv.org/abs/2208.03471
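For orientation, the recursive aggregation referred to here is the standard message-passing recursion (generic, not specific to this paper):

\[
h_v^{(k)} \;=\; \mathrm{UPDATE}\Big( h_v^{(k-1)},\ \mathrm{AGGREGATE}\big(\{\, h_u^{(k-1)} : u \in \mathcal{N}(v) \,\}\big) \Big),
\]

so after k rounds the representation of a node depends only on its k-hop neighbourhood; capturing an interaction between nodes at distance r therefore requires at least r rounds of aggregation.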
Published in:
International Journal of Approximate Reasoning, 2023
Information decompositions quantify how the Shannon information about a given random variable is distributed among several other random variables. Various requirements have been proposed that such a decomposition should satisfy, leading to different …
External link:
http://arxiv.org/abs/2204.10982
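As a reference point, in the bivariate case such a decomposition splits the mutual information between a target S and two predictors X_1, X_2 into shared (redundant), unique and complementary (synergistic) parts subject to consistency conditions; this is the standard setup, not a result specific to the paper:

\[
I(S; X_1, X_2) \;=\; SI(S; X_1, X_2) + UI(S; X_1 \setminus X_2) + UI(S; X_2 \setminus X_1) + CI(S; X_1, X_2),
\]
\[
I(S; X_1) \;=\; SI(S; X_1, X_2) + UI(S; X_1 \setminus X_2), \qquad I(S; X_2) \;=\; SI(S; X_1, X_2) + UI(S; X_2 \setminus X_1).
\]

A proposal for any one of the four terms then determines the remaining three.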
We characterize the power-law asymptotics of learning curves for Gaussian process regression (GPR) under the assumption that the eigenspectrum of the prior and the eigenexpansion coefficients of the target function follow a power law. Under similar assumptions …
External link:
http://arxiv.org/abs/2110.12231
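Schematically, and only for orientation (the exponent names are illustrative; the precise rates are derived in the paper), the assumed power laws and the resulting learning-curve behaviour have the form

\[
\lambda_k \;\asymp\; k^{-\alpha}, \qquad |c_k|^2 \;\asymp\; k^{-\beta} \quad\Longrightarrow\quad \mathbb{E}\big[\text{generalization error after } n \text{ samples}\big] \;\asymp\; n^{-\gamma(\alpha,\beta)},
\]

where \lambda_k are the eigenvalues of the prior covariance operator, c_k the eigenexpansion coefficients of the target function, and the exponent \gamma depends on both decay rates.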
Author:
Banerjee, Pradeep Kr., Montúfar, Guido
We present a unifying picture of PAC-Bayesian and mutual information-based upper bounds on the generalization error of randomized learning algorithms. As we show, Tong Zhang's information exponential inequality (IEI) gives a general recipe for constructing …
External link:
http://arxiv.org/abs/2105.01747
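A representative member of this family, given here only for orientation (the Xu-Raginsky mutual-information bound, not one of the specific bounds derived in the paper): if the loss \ell(w, Z) is \sigma-sub-Gaussian under the data distribution for every w, and W is the output of the algorithm trained on the i.i.d. sample S = (Z_1, \dots, Z_n), then

\[
\big|\, \mathbb{E}\big[ L_\mu(W) - L_S(W) \big] \,\big| \;\le\; \sqrt{\tfrac{2\sigma^2}{n}\, I(W; S)},
\]

where L_\mu and L_S denote population and empirical risk and I(W; S) is the mutual information between the learned hypothesis and the training sample.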
Published in:
International Journal of Approximate Reasoning, October 2023, vol. 161
Published in:
IEEE International Symposium on Information Theory (ISIT) 2019
The unique information ($UI$) is an information measure that quantifies a deviation from the Blackwell order. We have recently shown that this quantity is an upper bound on the one-way secret key rate. In this paper, we prove a triangle inequality for …
External link:
http://arxiv.org/abs/1901.08007
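For reference, the unique information used in this line of work is the measure of Bertschinger, Rauh, Olbrich, Jost and Ay: for a joint distribution P of (S, X, Y),

\[
UI(S; X \setminus Y) \;=\; \min_{Q \in \Delta_P} I_Q(S; X \mid Y), \qquad \Delta_P \;=\; \{\, Q : Q_{SX} = P_{SX},\ Q_{SY} = P_{SY} \,\},
\]

i.e. the least conditional mutual information compatible with the two pairwise marginals. It vanishes precisely when the channel from S to X can be obtained from the channel from S to Y by post-processing, which is the deviation from the Blackwell order mentioned in the abstract.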
Author:
Banerjee, Pradeep Kr., Montúfar, Guido
We introduce a bottleneck method for learning data representations based on information deficiency, rather than the more traditional information sufficiency. A variational upper bound allows us to implement this method efficiently. The bound itself is …
External link:
http://arxiv.org/abs/1810.11677
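For contrast only, the "information sufficiency" that the abstract alludes to is typically encouraged through the classical information bottleneck Lagrangian over stochastic encoders p(t | x),

\[
\min_{p(t \mid x)} \; I(X; T) \;-\; \beta\, I(T; Y), \qquad \beta > 0,
\]

which trades off compression of X against sufficiency of the representation T for Y; the deficiency-based objective introduced in the paper replaces the sufficiency term and is not reproduced here.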
Given two channels that convey information about the same random variable, we introduce two measures of the unique information of one channel with respect to the other. The two quantities are based on the notion of generalized weighted Le Cam deficiencies …
External link:
http://arxiv.org/abs/1807.05103
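Schematically, and only as orientation (the paper's weighted, generalized variants differ in detail), the one-sided Le Cam deficiency of a channel \kappa from S to X with respect to a channel \mu from S to Y measures how well \mu can be simulated by post-processing \kappa:

\[
\delta(\kappa, \mu) \;=\; \inf_{\lambda} \; \sup_{s} \; \big\| (\lambda \circ \kappa)(\cdot \mid s) \;-\; \mu(\cdot \mid s) \big\|_{\mathrm{TV}},
\]

where the infimum runs over stochastic maps (randomizations) \lambda from X to Y; weighted versions replace the supremum over inputs by an average under an input distribution.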
Given a pair of predictor variables and a response variable, how much information do the predictors have about the response, and how is this information distributed between unique, redundant, and synergistic components? Recent work has proposed to quantify …
External link:
http://arxiv.org/abs/1709.07487