Formal Synthesis of Lyapunov Neural Networks
Authors: | Abate, A.; Ahmed, D.; Giacobbe, M.; Peruffo, A. |
Year of publication: | 2021 |
Subject: |
FOS: Computer and information sciences; FOS: Electrical engineering, electronic engineering, information engineering; Lyapunov function; Computer Science - Machine Learning (cs.LG); Computer Science - Logic in Computer Science (cs.LO); Electrical Engineering and Systems Science - Systems and Control (eess.SY); Polynomial; Control and Optimization; Theoretical computer science; Set (abstract data type); Exponential stability; Satisfiability modulo theories; Soundness; Artificial neural network; Function (mathematics); Control and Systems Engineering |
Source: | IEEE Control Systems Letters, 5:773-778 |
ISSN: | 2475-1456 |
DOI: | 10.1109/lcsys.2020.3005328 |
Description: | We propose an automatic and formally sound method for synthesising Lyapunov functions for the asymptotic stability of autonomous non-linear systems. Traditional methods are either analytical and require manual effort, or numerical but lacking formal soundness. Symbolic computational methods for Lyapunov functions, which sit in between, give formal guarantees but are typically semi-automatic because they rely on the user to provide appropriate function templates. We propose a method that finds Lyapunov functions fully automatically, using machine learning, while also providing formal guarantees, using satisfiability modulo theories (SMT). We employ a counterexample-guided approach in which a numerical learner and a symbolic verifier interact to construct provably correct Lyapunov neural networks (LNNs). The learner trains a neural network that satisfies the Lyapunov criteria for asymptotic stability over a set of samples; the verifier either proves via SMT solving that the criteria are satisfied over the whole domain, or augments the sample set with counterexamples. Our method supports neural networks with polynomial activation functions and of varying depth and width, which display broad learning capabilities. We demonstrate our method on several non-trivial benchmarks and compare it favourably against a numerical optimisation-based approach, a symbolic template-based approach, and a cognate LNN-based approach. Our method synthesises Lyapunov functions faster and over wider spatial domains than the alternatives, while providing equal or stronger guarantees. |
Database: | OpenAIRE |
External link: |
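The counterexample-guided (CEGIS-style) loop described in the abstract can be sketched in miniature. Everything below is an illustrative assumption rather than the paper's implementation: the Lyapunov candidate is a simple quadratic form instead of a neural network with polynomial activations, the toy system is linear, and a dense grid search stands in for the SMT-based verifier (which, unlike a grid check, proves the criteria over the whole continuous domain).

```python
# Illustrative sketch of a counterexample-guided Lyapunov search.
# Assumptions (not from the paper): toy 2-D linear system, quadratic
# candidate V(x, y) = w0*x^2 + w1*y^2, grid check instead of SMT.
import random

DOMAIN = 1.0   # verify over the square [-1, 1]^2
INNER = 0.1    # exclude a small ball around the equilibrium

def f(x, y):
    """Toy autonomous system with an asymptotically stable origin."""
    return (-x + y, -x - y)

def V(w, x, y):
    """Quadratic Lyapunov candidate (stand-in for the paper's LNN)."""
    return w[0] * x * x + w[1] * y * y

def Vdot(w, x, y):
    """Lie derivative of V along f, i.e. grad(V) . f."""
    fx, fy = f(x, y)
    return 2.0 * w[0] * x * fx + 2.0 * w[1] * y * fy

def learner(w, samples, lr=0.05, steps=500, margin=1e-3):
    """Numerical learner: gradient steps until Vdot < -margin on samples."""
    w = list(w)
    for _ in range(steps):
        worst = max(samples, key=lambda p: Vdot(w, *p))
        if Vdot(w, *worst) < -margin:
            break  # all samples satisfy the decrease condition
        x, y = worst
        fx, fy = f(x, y)
        # Vdot is linear in w, so these are its exact partial derivatives;
        # clip at 0.1 to keep V positive definite.
        w[0] = max(w[0] - lr * 2.0 * x * fx, 0.1)
        w[1] = max(w[1] - lr * 2.0 * y * fy, 0.1)
    return w

def verifier(w, grid=50):
    """Stand-in for the SMT verifier: grid search for a violation."""
    for i in range(-grid, grid + 1):
        for j in range(-grid, grid + 1):
            x, y = DOMAIN * i / grid, DOMAIN * j / grid
            if x * x + y * y < INNER * INNER:
                continue  # skip the excluded ball around the origin
            if V(w, x, y) <= 0 or Vdot(w, x, y) >= 0:
                return (x, y)  # counterexample to the Lyapunov criteria
    return None  # criteria hold at every grid point

def cegis(max_rounds=30):
    """Alternate learner and verifier until no counterexample remains."""
    random.seed(0)
    samples = [(random.uniform(-1, 1), random.uniform(-1, 1))
               for _ in range(10)]
    w = [1.0, 6.0]  # deliberately poor start: Vdot > 0 somewhere
    for _ in range(max_rounds):
        w = learner(w, samples)
        cex = verifier(w)
        if cex is None:
            return w  # candidate passes the (grid) verification
        samples.append(cex)  # counterexample guides the next round
    return None
```

Running `cegis()` returns weights for which the grid verifier finds no violation; in the paper the same loop terminates with an SMT proof, so the synthesised LNN is correct over the entire domain rather than on a discretisation.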