Lazily Adapted Constant Kinky Inference for Nonparametric Regression and Model-Reference Adaptive Control

Author: Jan M. Maciejowski, Stephen J. Roberts, Carl Edward Rasmussen, Jan-Peter Calliess
Contributors: Rasmussen, Carl [0000-0001-8899-7850], Maciejowski, Jan [0000-0001-8281-8364], Apollo - University of Cambridge Repository
Year of publication: 2020
Subject:
DOI: 10.17863/cam.54368
Description: Techniques known as Nonlinear Set Membership prediction or Lipschitz Interpolation are approaches to supervised machine learning that utilise presupposed Lipschitz properties to perform inference over unobserved function values. Provided a bound on the true best Lipschitz constant of the target function is known a priori, they offer convergence guarantees, as well as bounds around the predictions. Considering a more general setting that builds on Lipschitz continuity, we propose a method for estimating the Lipschitz constant online from function-value observations that are possibly corrupted by bounded noise. Utilising this as a data-dependent hyper-parameter gives rise to a nonparametric machine learning method, for which we establish strong universal approximation guarantees. That is, we show that our prediction rule can learn any continuous function on compact support in the limit of increasingly dense data, up to a worst-case error that can be bounded by the level of observational error. We also consider applications of our nonparametric regression method to learning-based control. For a class of discrete-time settings, we establish convergence guarantees on the closed-loop tracking error of our online learning-based controllers. To provide evidence that our method can be beneficial not only in theory but also in practice, we apply it in the context of nonparametric model-reference adaptive control (MRAC). Across a range of simulated aircraft roll-dynamics scenarios and performance metrics, our approach outperforms recently proposed alternatives based on Gaussian processes and RBF neural networks.
Database: OpenAIRE
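
As a rough illustration of the ideas summarised in the description, the Python sketch below combines the standard Lipschitz-interpolation (kinky inference) predictor with a simple noise-discounted, pairwise-slope estimate of the Lipschitz constant computed from the observed data. The function names, the exact form of the estimator, and the toy sine example are illustrative assumptions made for this sketch, not the authors' precise algorithm.

import numpy as np

def estimate_lipschitz_constant(X, y, noise_bound=0.0):
    # Illustrative estimate: largest observed pairwise slope, discounted by
    # the assumed bound on the observational noise (clipped at zero).
    X = np.atleast_2d(X)
    y = np.asarray(y, dtype=float)
    L = 0.0
    for i in range(len(y)):
        for j in range(i + 1, len(y)):
            d = np.linalg.norm(X[i] - X[j])
            if d > 0:
                slope = (abs(y[i] - y[j]) - 2.0 * noise_bound) / d
                L = max(L, slope)
    return L

def kinky_inference_predict(x, X, y, L):
    # Lipschitz-interpolation prediction: midpoint of the tightest upper and
    # lower envelopes consistent with Lipschitz constant L and the data.
    x = np.asarray(x, dtype=float)
    d = np.linalg.norm(np.atleast_2d(X) - x, axis=1)
    upper = np.min(y + L * d)  # ceiling envelope
    lower = np.max(y - L * d)  # floor envelope
    return 0.5 * (upper + lower)

# Toy usage: learn sin(x) on [0, 2*pi] from samples with bounded noise.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 2.0 * np.pi, size=(50, 1))
y = np.sin(X).ravel() + rng.uniform(-0.05, 0.05, size=50)
L_hat = estimate_lipschitz_constant(X, y, noise_bound=0.05)
print(kinky_inference_predict([np.pi / 2], X, y, L_hat))  # close to sin(pi/2) = 1

In an online setting, the estimated constant and the data set would be updated as each new observation arrives, which is what makes the hyper-parameter data-dependent rather than fixed a priori.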