A deterministic and computable Bernstein-von Mises theorem

Author: Dehaene, Guillaume P.
Year of publication: 2019
Subject:
Document type: Working Paper
Description: Bernstein-von Mises (BvM) results establish that the Laplace approximation is asymptotically correct in the large-data limit. However, these results are ill-suited for computational purposes, since they hold only over most, not all, datasets and involve hard-to-estimate constants. In this article, I present a new BvM theorem which bounds the Kullback-Leibler (KL) divergence between a fixed log-concave density $f\left(\boldsymbol{\theta}\right)$ and its Laplace approximation. The bound goes to $0$ as the higher derivatives of $f\left(\boldsymbol{\theta}\right)$ tend to $0$ and $f\left(\boldsymbol{\theta}\right)$ becomes increasingly Gaussian. The classical BvM theorem in the IID large-data limit is recovered as a corollary. Critically, this theorem further suggests a number of computable approximations of the KL divergence, the most promising being: \[ KL\left(g_{LAP},f\right)\approx\frac{1}{2}\text{Var}_{\boldsymbol{\theta}\sim g\left(\boldsymbol{\theta}\right)}\left(\log\left[f\left(\boldsymbol{\theta}\right)\right]-\log\left[g_{LAP}\left(\boldsymbol{\theta}\right)\right]\right) \] An empirical investigation of these approximations in the logistic classification model reveals that they are excellent surrogates for the KL divergence. This result, and future results of a similar nature, could provide a path towards rigorously controlling the error due to the Laplace approximation and to more modern approximation methods.
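The surrogate above involves only a variance under the approximating density, so it can be estimated by plain Monte Carlo once the Laplace approximation is available. The following is a minimal sketch, not code from the paper: the toy logistic-regression posterior, the data, and the sample sizes are my own assumptions, and the subscript $\boldsymbol{\theta}\sim g\left(\boldsymbol{\theta}\right)$ is read here as sampling from the Laplace approximation $g_{LAP}$.

```python
# Minimal sketch (illustrative assumptions, not the paper's code): estimate the
# surrogate KL(g_LAP, f) ~= 0.5 * Var_{theta ~ g_LAP}(log f - log g_LAP)
# for a toy logistic-regression posterior with a standard normal prior.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))                                   # toy covariates
y = (rng.random(100) < 1 / (1 + np.exp(-(X @ np.array([1.0, -0.5]))))).astype(float)

def log_f(theta):
    """Unnormalized log-posterior: logistic likelihood + standard normal prior."""
    logits = X @ theta
    return np.sum(y * logits - np.logaddexp(0.0, logits)) - 0.5 * theta @ theta

def neg_hessian_log_f(theta):
    """Exact Hessian of -log f for this particular model."""
    p = 1 / (1 + np.exp(-(X @ theta)))
    return X.T @ (X * (p * (1 - p))[:, None]) + np.eye(len(theta))

# Laplace approximation: Gaussian at the mode, covariance = inverse Hessian there.
mode = minimize(lambda t: -log_f(t), x0=np.zeros(2)).x
H = neg_hessian_log_f(mode)
cov = np.linalg.inv(H)

def log_g_lap(theta):
    """Unnormalized log-density of the Laplace approximation (same constant as log f);
    additive constants cancel inside the variance below."""
    d = theta - mode
    return log_f(mode) - 0.5 * d @ H @ d

# Monte Carlo estimate of the surrogate, sampling from the Laplace approximation.
samples = rng.multivariate_normal(mode, cov, size=5000)
log_ratio = np.array([log_f(t) - log_g_lap(t) for t in samples])
print(f"KL surrogate estimate: {0.5 * np.var(log_ratio):.4f}")
```

Because the log-ratio is evaluated only at samples from the Gaussian approximation, the estimate requires no normalizing constants and no integration over $f$, which is what makes the surrogate computable in practice.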
Comment: The first version contained an incorrect claim in Section 5.1: in general, the KL divergence does not bound the difference of the moments
Database: arXiv