Showing 1 - 10 of 39 results for the search: '"Esposito, Amedeo Roberto"'
Information measures can be constructed from Rényi divergences much like mutual information from the Kullback-Leibler divergence. One such information measure is known as Sibson's $\alpha$-mutual information and has received renewed attention recently…
External link: http://arxiv.org/abs/2405.08352
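For context, here is the standard construction the snippet refers to (textbook material, not quoted from the paper): starting from the Rényi divergence of order $\alpha$, Sibson's measure is obtained by minimizing over an auxiliary marginal $Q_Y$, with $P_{XY}$ the joint distribution and $P_X$ the marginal of $X$:

$$
D_\alpha(P\,\|\,Q) = \frac{1}{\alpha-1}\,\log \mathbb{E}_{Q}\!\left[\left(\frac{dP}{dQ}\right)^{\!\alpha}\right],
\qquad
I_\alpha(X;Y) = \min_{Q_Y}\, D_\alpha\!\left(P_{XY}\,\middle\|\,P_X \otimes Q_Y\right),
$$

for $\alpha \in (0,1)\cup(1,\infty)$; letting $\alpha \to 1$ recovers the Kullback-Leibler divergence and the usual mutual information $I(X;Y)$.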
Strong data processing inequalities (SDPIs) are an important object of study in Information Theory and have been well studied for $f$-divergences. Universal upper and lower bounds have been provided, along with several applications connecting them to…
External link: http://arxiv.org/abs/2403.10656
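As background (standard material on $f$-divergences rather than a result of the paper): for a channel, i.e. a Markov kernel $K$, the strong data processing inequality is usually phrased through the contraction coefficient

$$
\eta_f(K) \;=\; \sup_{P \neq Q:\; 0 < D_f(P\|Q) < \infty} \frac{D_f(PK\,\|\,QK)}{D_f(P\,\|\,Q)} \;\le\; 1,
\qquad\text{so that}\qquad
D_f(PK\,\|\,QK) \;\le\; \eta_f(K)\, D_f(P\,\|\,Q),
$$

where $PK$, $QK$ denote the output distributions; the inequality is "strong" precisely when $\eta_f(K) < 1$.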
We introduce a novel concept of convergence for Markovian processes within Orlicz spaces, extending beyond the conventional approach associated with $L_p$ spaces. After showing that Markovian operators are contractive in Orlicz spaces, our key…
External link: http://arxiv.org/abs/2402.11200
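For readers unfamiliar with the setting, the Orlicz norm presumably meant here is the Luxemburg norm attached to a Young function $\psi$ (a standard definition, not a detail taken from the abstract):

$$
\|f\|_{\psi} \;=\; \inf\left\{ c > 0 \;:\; \mathbb{E}\!\left[\psi\!\left(\tfrac{|f|}{c}\right)\right] \le 1 \right\},
$$

which reduces to the $L_p$ norm for $\psi(x) = x^p$; "contractive" for a Markovian operator $T$ can then be read, under this assumption, as $\|Tf\|_\psi \le \|f\|_\psi$.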
This paper focuses on parameter estimation and introduces a new method for lower bounding the Bayesian risk. The method allows for the use of virtually \emph{any} information measure, including Rényi's $\alpha$-Divergences, $\varphi$-Divergences, and Sibson's $\alpha$-Mutual Information…
External link: http://arxiv.org/abs/2303.12497
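To fix ideas, the classical baseline that such lower bounds generalize (Fano's inequality, not the bound of the paper) reads, for a parameter $W$ uniform on a finite set $\mathcal{W}$, an observation $X$, and any estimator $\hat{W}(X)$:

$$
\Pr\!\left[\hat{W}(X) \neq W\right] \;\ge\; 1 - \frac{I(W;X) + \log 2}{\log |\mathcal{W}|},
$$

and the abstract's claim is that the mutual information $I(W;X)$ can be traded for essentially any of the information measures listed above.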
We propose a novel approach to concentration for non-independent random variables. The main idea is to "pretend" that the random variables are independent and pay a multiplicative price measuring how far they are from actually being independent.
External link: http://arxiv.org/abs/2303.07245
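A minimal sketch of the "multiplicative price" idea, obtained from a one-line Hölder argument (and therefore only illustrative of the flavor, not necessarily the bound proved in the paper): for random variables $X, Y$ with $P_{XY} \ll P_X \otimes P_Y$, any event $E$, and any $\alpha > 1$,

$$
P_{XY}(E) \;\le\; \Big(P_X \otimes P_Y(E)\Big)^{\frac{\alpha-1}{\alpha}} \exp\!\left(\frac{\alpha-1}{\alpha}\, D_\alpha\!\big(P_{XY}\,\big\|\,P_X \otimes P_Y\big)\right),
$$

i.e. one evaluates the probability as if $X$ and $Y$ were independent and pays a multiplicative factor controlled by a Rényi divergence measuring their dependence.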
We adopt an information-theoretic framework to analyze the generalization behavior of the class of iterative, noisy learning algorithms. This class is particularly suitable for study under information-theoretic metrics as the algorithms are inherently…
External link: http://arxiv.org/abs/2302.14518
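For orientation, the classical bound this line of work starts from (Xu and Raginsky, 2017; not a contribution of the paper itself) states that, for a training sample $S$ of $n$ i.i.d. points, an algorithm output $W$, and a loss that is $\sigma$-subgaussian,

$$
\Big|\,\mathbb{E}\big[\mathrm{gen}(W,S)\big]\,\Big| \;\le\; \sqrt{\frac{2\sigma^2}{n}\, I(S;W)},
$$

where $\mathrm{gen}(W,S)$ is the gap between population and empirical risk; for iterative, noisy algorithms $I(S;W)$ can then be bounded step by step through the chain rule and the injected noise.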
In this work, we connect the problem of bounding the expected generalisation error with transportation-cost inequalities. Exposing the underlying pattern behind both approaches, we are able to generalise them and go beyond Kullback-Leibler Divergences…
External link: http://arxiv.org/abs/2202.03956
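As a reference point (the standard $T_1$ inequality, not the paper's generalized version): a measure $P$ satisfies a transportation-cost inequality $T_1(c)$ if, for all $Q$,

$$
W_1(Q, P) \;\le\; \sqrt{2\,c\, D(Q\,\|\,P)},
$$

with $W_1$ the 1-Wasserstein distance and $D$ the Kullback-Leibler divergence; by the Bobkov-Götze duality this is equivalent to every 1-Lipschitz function being $c$-subgaussian under $P$, which is the kind of structure that feeds directly into expected-generalisation-error bounds.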
We explore a family of information measures that stems from Rényi's $\alpha$-Divergences with $\alpha<0$. In particular, we extend the definition of Sibson's $\alpha$-Mutual Information to negative values of $\alpha$ and show several properties of…
External link: http://arxiv.org/abs/2202.03951
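For discrete alphabets, Sibson's measure has the closed form below (standard for $\alpha > 0$; how, and whether, it survives at $\alpha < 0$ is precisely what the paper investigates, so this is only the starting point):

$$
I_\alpha(X;Y) \;=\; \frac{\alpha}{\alpha-1}\,\log \sum_{y} \left( \sum_{x} P_X(x)\, P_{Y|X}(y \mid x)^{\alpha} \right)^{\!1/\alpha}.
$$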
We consider the problem of parameter estimation in a Bayesian setting and propose a general lower bound that includes part of the family of $f$-Divergences. The results are then applied to specific settings of interest and compared to other notable results…
External link: http://arxiv.org/abs/2202.02557
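For completeness, the definition assumed throughout this line of work (not a contribution of the paper): given a convex $f$ with $f(1) = 0$, the $f$-divergence between $P$ and $Q$ is

$$
D_f(P\,\|\,Q) \;=\; \mathbb{E}_{Q}\!\left[f\!\left(\frac{dP}{dQ}\right)\right],
$$

with $f(t) = t\log t$ recovering the Kullback-Leibler divergence and $f(t) = \tfrac{1}{2}|t-1|$ the total-variation distance.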
In this work, we analyse how to define a conditional version of Sibson's $\alpha$-Mutual Information. Several such definitions can be advanced, and they all lead to different information measures with different (but similar) operational meanings.
External link: http://arxiv.org/abs/2102.00720
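To see why several non-equivalent conditional versions arise, here is one hypothetical candidate (purely illustrative; it is not claimed to be any of the definitions the paper compares): starting from $I_\alpha(X;Y) = \min_{Q_Y} D_\alpha(P_{XY}\,\|\,P_X \otimes Q_Y)$, conditioning on $Z$ could be introduced as

$$
I_\alpha(X;Y \mid Z) \;=\; \min_{Q_{Y\mid Z}}\, D_\alpha\!\left(P_{XYZ}\,\middle\|\,P_{X\mid Z}\, Q_{Y\mid Z}\, P_Z\right),
$$

but the minimization, the auxiliary distribution, and the averaging over $Z$ can each be placed differently, and each placement yields a different information measure with its own operational meaning.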