Popis: |
Uncovering causal interdependencies from observational data is one of the great challenges of nonlinear time series analysis. In this paper, we discuss this topic with the help of the information-theoretic concept known as the R\'enyi information measure. In particular, we tackle the directional information flow between bivariate time series in terms of R\'enyi transfer entropy. We show that by choosing the R\'enyi $\alpha$ parameter appropriately, we can control the information that is transferred only between selected parts of the underlying distributions. This, in turn, provides a particularly potent tool for quantifying causal interdependencies in time series where knowledge of "black swan" events such as spikes or sudden jumps is of key importance. In this connection, we first prove that for Gaussian variables, Granger causality and R\'enyi transfer entropy are entirely equivalent. Moreover, we partially extend this result to heavy-tailed $\alpha$-Gaussian variables. These results allow us to establish a connection between autoregressive and R\'enyi-entropy-based information-theoretic approaches to data-driven causal inference. To aid our intuition, we employ the Leonenko et al. entropy estimator and analyze the R\'enyi information flow between bivariate time series generated from two unidirectionally coupled R\"ossler systems. Notably, we find that R\'enyi transfer entropy not only allows us to detect the synchronization threshold but also provides non-trivial insight into the structure of the transient regime that exists between the region of chaotic correlations and the synchronization threshold. In addition, from the R\'enyi transfer entropy we could reliably infer the direction of coupling, and hence causality, only for coupling strengths smaller than the onset value of the transient regime, i.e., when the two R\"ossler systems were coupled but had not yet synchronized.