Popis: |
Let $\boldsymbol{X}\in \mathbb{R}^p$ and $Y\in \mathbb{R}$ be two random variables. We estimate the conditional covariance matrix $\mathrm{Cov}\left(\mathrm{E}\left[\boldsymbol{X}\vert Y\right]\right)$ by applying a plug-in kernel-based algorithm to its entries. Next, we investigate the estimator's rate of convergence under smoothness hypotheses on the density function of $(\boldsymbol{X},Y)$. In a high-dimensional context, we improve the consistency of the whole matrix estimator by assuming a decreasing structure on the entries of $\mathrm{Cov}\left(\mathrm{E}\left[\boldsymbol{X}\vert Y\right]\right)$. Finally, we illustrate a sliced inverse regression setting for time series that matches the conditions of our estimator.
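
A minimal sketch of the plug-in idea, assuming a Gaussian Nadaraya–Watson smoother for $\mathrm{E}\left[\boldsymbol{X}\vert Y\right]$; the function name `plug_in_cov_est`, the fixed bandwidth, and the synthetic single-index model are illustrative assumptions, not the paper's exact algorithm:

```python
import numpy as np

def plug_in_cov_est(X, Y, bandwidth):
    """Illustrative plug-in kernel estimate of Cov(E[X|Y]).

    X : (n, p) array, Y : (n,) array.
    A Gaussian Nadaraya-Watson smoother (assumed here) estimates
    m(y) = E[X|Y=y] at each observed Y_i; the sample covariance of
    these fitted values is the plug-in estimate of Cov(E[X|Y]).
    """
    n, p = X.shape
    diffs = (Y[:, None] - Y[None, :]) / bandwidth
    W = np.exp(-0.5 * diffs**2)               # Gaussian kernel weights
    W /= W.sum(axis=1, keepdims=True)         # row-normalize -> NW weights
    M = W @ X                                 # fitted values m_hat(Y_i), shape (n, p)
    Mc = M - M.mean(axis=0)                   # center the fitted values
    return Mc.T @ Mc / n                      # plug-in covariance estimate

# Usage on a synthetic single-index model (assumed example):
rng = np.random.default_rng(0)
n, p = 500, 5
X = rng.standard_normal((n, p))
beta = np.zeros(p); beta[0] = 1.0
Y = np.sin(X @ beta) + 0.1 * rng.standard_normal(n)
C_hat = plug_in_cov_est(X, Y, bandwidth=0.3)
# Under this model Cov(E[X|Y]) has rank 1, and the leading eigenvector
# of C_hat should roughly align with beta (the sliced-inverse-regression direction).
```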