Showing 1 - 10 of 22 results for the search: '"Revay, Max"'
Neural networks are typically sensitive to small input perturbations, leading to unexpected or brittle behaviour. We present RobustNeuralNetworks.jl: a Julia package for neural network models that are constructed to naturally satisfy a set of user-defined …
External link:
http://arxiv.org/abs/2306.12612
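As background for the package above, one common form of "user-defined robustness constraint" is a prescribed Lipschitz bound on the learned map (a minimal sketch of the idea, not the package's full constraint set): a model f satisfies a Lipschitz bound \gamma if

    \| f(x_1) - f(x_2) \| \le \gamma \, \| x_1 - x_2 \| \quad \text{for all inputs } x_1, x_2,

so the effect of any input perturbation on the output is bounded in advance by the user's choice of \gamma.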
This paper proposes a nonlinear policy architecture for control of partially-observed linear dynamical systems, providing built-in closed-loop stability guarantees. The policy is based on a nonlinear version of the Youla parameterization, and augments …
External link:
http://arxiv.org/abs/2112.04219
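A minimal sketch of the classical linear Youla idea that the policy above generalizes (standard background, stated under the simplifying assumption of an open-loop stable plant G, not the paper's exact construction): with feedback u = K(r - y), reparameterize the controller as K = (I - QG)^{-1} Q. The closed loop is then internally stable if and only if the parameter Q is stable, and the closed-loop maps are affine in Q,

    S = (I + GK)^{-1} = I - GQ, \qquad T = GK(I + GK)^{-1} = GQ,

so searching over stable Q (here, a learned nonlinear Q with built-in stability guarantees) searches only over stabilizing controllers.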
This tutorial paper provides an introduction to recently developed tools for machine learning, especially learning dynamical systems (system identification), with stability and robustness constraints. The main ideas are drawn from contraction analysis …
External link:
http://arxiv.org/abs/2110.00207
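For reference, a minimal statement of the core condition from contraction analysis (standard background, my paraphrase rather than the tutorial's notation): a system \dot{x} = f(x, t) is contracting with rate \lambda > 0 in a metric M(x, t) \succ 0 if

    \dot{M} + M \frac{\partial f}{\partial x} + \frac{\partial f}{\partial x}^{\top} M \preceq -2 \lambda M,

in which case any two trajectories converge to each other exponentially; stability and robustness then hold for all trajectories and inputs, not just near a nominal equilibrium.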
This paper proposes methods for identification of large-scale networked systems with guarantees that the resulting model will be contracting -- a strong form of nonlinear stability -- and/or monotone, i.e. order relations between states are preserved.
External link:
http://arxiv.org/abs/2107.14309
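Minimal working definitions of the two properties mentioned above (textbook versions, my paraphrase rather than the paper's statements): a model x_{t+1} = f(x_t, u_t) is contracting if trajectories from different initial conditions under the same input converge, e.g.

    \| x_t^a - x_t^b \| \le K \alpha^t \| x_0^a - x_0^b \| \quad \text{for some } K > 0, \ \alpha \in (0, 1),

and it is monotone if elementwise order is preserved: x_0^a \le x_0^b and u_t^a \le u_t^b for all t imply x_t^a \le x_t^b for all t.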
This paper introduces recurrent equilibrium networks (RENs), a new class of nonlinear dynamical models for applications in machine learning, system identification and control. The new model class admits "built in" behavioural guarantees of stability …
External link:
http://arxiv.org/abs/2104.05942
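A minimal sketch of the model structure named above, assuming the usual equilibrium-network form (the paper's contribution is the parameterization of the weights, which is not shown here): a REN is a state-space model whose per-step nonlinearity w_t is defined implicitly,

    x_{t+1} = A x_t + B_1 w_t + B_2 u_t + b_x,
    w_t = \sigma( C_1 x_t + D_{11} w_t + D_{12} u_t + b_v ),
    y_t = C_2 x_t + D_{21} w_t + D_{22} u_t + b_y,

with the weights produced from a free, unconstrained parameter vector so that every model in the class satisfies the stated stability and robustness guarantees by construction.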
This paper introduces new parameterizations of equilibrium neural networks, i.e. networks defined by implicit equations. This model class includes standard multilayer and residual networks as special cases. The new parameterization admits a Lipschitz bound …
External link:
http://arxiv.org/abs/2010.01732
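To make the phrase "networks defined by implicit equations" concrete, here is a small, self-contained sketch (an illustration under my own assumptions, not the paper's parameterization): the layer output z solves z = \sigma(W z + U x + b), and an ordinary multilayer network is recovered when W is strictly lower block-triangular, since the fixed point can then be read off layer by layer.

import numpy as np

def equilibrium_layer(x, W, U, b, tol=1e-8, max_iter=500):
    # Solve z = relu(W z + U x + b) by naive fixed-point iteration.
    # This converges, for example, when the iteration is a contraction
    # (e.g. spectral norm of W below 1); it is only an illustration,
    # not the training-time solver or parameterization from the paper.
    z = np.zeros(W.shape[0])
    for _ in range(max_iter):
        z_next = np.maximum(0.0, W @ z + U @ x + b)  # ReLU activation
        if np.linalg.norm(z_next - z) < tol:
            break
        z = z_next
    return z_next

# Example: a strictly lower block-triangular W makes the implicit layer
# equivalent to a feedforward network, so a couple of iterations from
# zero already reach the exact fixed point in this 2-block case.
rng = np.random.default_rng(0)
x = rng.standard_normal(3)
U = rng.standard_normal((4, 3))
b = rng.standard_normal(4)
W = np.zeros((4, 4))
W[2:, :2] = rng.standard_normal((2, 2))  # only the lower block is nonzero
z = equilibrium_layer(x, W, U, b)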
Recurrent neural networks (RNNs) are a class of nonlinear dynamical systems often used to model sequence-to-sequence maps. RNNs have excellent expressive power but lack the stability or robustness guarantees that are necessary for many applications.
External link:
http://arxiv.org/abs/2004.05290
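For readers outside control, the "dynamical systems" view mentioned above just means a state-space recursion (a generic textbook form, not a specific model from the paper),

    x_{t+1} = \sigma( A x_t + B u_t + b ), \qquad y_t = C x_t,

mapping an input sequence u_0, u_1, \ldots to an output sequence y_0, y_1, \ldots; the stability and robustness questions concern how this recursion responds to perturbations of its inputs and initial state.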
Author:
Revay, Max, Manchester, Ian R.
Stability of recurrent models is closely linked with trainability, generalizability and, in some applications, safety. Methods that train stable recurrent neural networks, however, do so at a significant cost to expressibility. We propose an implicit …
External link:
http://arxiv.org/abs/1912.10402
This paper gives convex conditions for synthesis of a distributed control system for large-scale networked nonlinear dynamic systems. It is shown that the technique of control contraction metrics (CCMs) can be extended to this problem by utilizing …
External link:
http://arxiv.org/abs/1810.04794
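A minimal statement of the (centralized) control contraction metric condition that the paper extends (standard background, my paraphrase; the distributed, networked formulation is the paper's contribution and is not reproduced here): for \dot{x} = f(x) + B(x) u, a metric M(x) \succ 0 is a CCM with rate \lambda > 0 if

    \delta_x^{\top} \left( \dot{M} + \frac{\partial f}{\partial x}^{\top} M + M \frac{\partial f}{\partial x} + 2 \lambda M \right) \delta_x < 0 \quad \text{whenever} \quad \delta_x^{\top} M B = 0, \ \delta_x \neq 0,

and a stabilizing feedback can then be constructed from geodesics of the metric.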