Author: Kielstra, P. Michael; Lindsey, Michael
Year of Publication: 2024
Subject:
Document Type: Working Paper
Description: Gaussian Process Regression (GPR) is widely used for inferring functions from noisy data. GPR crucially relies on the choice of a kernel, which might be specified in terms of a collection of hyperparameters that must be chosen or learned. Fully Bayesian GPR seeks to infer these kernel hyperparameters in a Bayesian sense, and the key computational challenge in sampling from their posterior distribution is the need for frequent determinant evaluations of large kernel matrices. This paper introduces a gradient-based, determinant-free approach for fully Bayesian GPR that combines a Gaussian integration trick for avoiding the determinant with Hamiltonian Monte Carlo (HMC) sampling. Our framework permits a matrix-free formulation and reduces the handling of hyperparameter gradients to simple automatic differentiation. Our implementation is highly flexible and leverages GPU acceleration with a linear-scaling memory footprint. Numerical experiments demonstrate the method's ability to scale gracefully to both high-dimensional hyperparameter spaces and large kernel matrices.
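A minimal sketch of the determinant-avoidance idea, assuming (the record does not spell this out) that the "Gaussian integration trick" refers to the standard Gaussian integral identity for a symmetric positive definite kernel matrix $K_\theta \in \mathbb{R}^{n \times n}$:

\[
\det(K_\theta)^{-1/2} \;=\; (2\pi)^{-n/2} \int_{\mathbb{R}^n} \exp\!\left(-\tfrac{1}{2}\, z^\top K_\theta\, z\right) dz .
\]

Under this reading, the determinant factor in the hyperparameter posterior can be traded for an auxiliary Gaussian variable $z$ sampled jointly with the hyperparameters, and evaluating the resulting integrand requires only matrix-vector products with $K_\theta$, which is consistent with the matrix-free, GPU-friendly formulation described in the abstract.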
Database: arXiv
External Link: