Showing 1 - 10 of 175 for search: '"Petreczky, Mihaly"'
Many state-of-the-art models trained on long-range sequences, for example S4, S5 or LRU, are made of sequential blocks combining State-Space Models (SSMs) with neural networks. In this paper we provide a PAC bound that holds for these kinds of architectures…
External link:
http://arxiv.org/abs/2405.20278
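The architecture class the abstract refers to (S4/S5/LRU-style models) stacks linear state-space recurrences with pointwise nonlinearities. A minimal numpy sketch of one such block; all names and shapes here are illustrative, not taken from the paper:

```python
import numpy as np

def ssm_block(u, A_diag, B, C):
    """One S4/S5-style block (toy sketch): a diagonal linear SSM
    x_{t+1} = A x_t + B u_t, y_t = C x_t, followed by a pointwise
    nonlinearity standing in for the neural-network part."""
    T, _ = u.shape
    x = np.zeros(A_diag.shape[0])
    ys = []
    for t in range(T):
        x = A_diag * x + B @ u[t]   # linear state update (diagonal A)
        ys.append(C @ x)            # linear readout
    return np.tanh(np.stack(ys))    # pointwise nonlinearity

# Stacking such blocks sequentially gives the kind of architecture
# the PAC bound is stated for.
rng = np.random.default_rng(0)
u = rng.normal(size=(16, 3))
A = 0.9 * np.ones(8)                # stable recurrence (|lambda| < 1)
B = rng.normal(size=(8, 3)) / 8
C = rng.normal(size=(4, 8)) / 8
y = ssm_block(u, A, B, C)
print(y.shape)                      # (16, 4)
```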
One of the main theoretical challenges in learning dynamical systems from data is providing upper bounds on the generalization error, that is, the difference between the expected prediction error and the empirical prediction error measured on some finite…
External link:
http://arxiv.org/abs/2405.10054
In this paper, we consider stochastic realization theory of Linear Switched Systems (LSS) with i.i.d. switching. We characterize minimality of stochastic LSSs and show existence and uniqueness (up to isomorphism) of minimal LSSs in innovation form…
External link:
http://arxiv.org/abs/2403.14259
In this paper, we study a class of stochastic Generalized Linear Switched Systems (GLSS), which includes subclasses of jump-Markov, piecewise-linear and Linear Parameter-Varying (LPV) systems. We prove that the output of such systems can be decomposed…
External link:
http://arxiv.org/abs/2403.11012
Published in:
AAAI, vol. 38, no. 11, pp. 11901-11909, Mar. 2024
In this paper, we derive a PAC-Bayes bound on the generalisation gap, in a supervised time-series setting, for a special class of discrete-time non-linear dynamical systems. This class includes stable recurrent neural networks (RNNs), and the motivation…
External link:
http://arxiv.org/abs/2312.09793
This work brings together the moment matching approach based on Loewner functions and the classical Loewner framework based on the Loewner pencil in the case of bilinear systems. New Loewner functions are defined based on the bilinear Loewner framework…
External link:
http://arxiv.org/abs/2311.06125
Recent advances in deep learning have given us some very promising results on the generalization ability of deep neural networks; however, the literature still lacks a comprehensive theory explaining why heavily over-parametrized models are able to generalize…
External link:
http://arxiv.org/abs/2310.17378
In this work, we examine Asymmetric Shapley Values (ASV), a variant of the popular SHAP additive local explanation method. ASV proposes a way to improve model explanations by incorporating known causal relations between variables, and is also considered…
External link:
http://arxiv.org/abs/2310.09961
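The difference between standard SHAP and Asymmetric Shapley Values can be illustrated by exact enumeration: symmetric Shapley values average marginal contributions over all feature orderings, while ASV averages only over orderings consistent with a known causal order. A toy sketch; the value function and feature names are invented for illustration:

```python
from itertools import permutations

def shapley(v, features, allowed=None):
    """Exact Shapley values by enumerating orderings; an `allowed`
    predicate restricts to causally consistent orderings, which is
    the idea behind Asymmetric Shapley Values."""
    phi = {f: 0.0 for f in features}
    orders = [o for o in permutations(features)
              if allowed is None or allowed(o)]
    for order in orders:
        coalition = set()
        for f in order:
            before = v(frozenset(coalition))
            coalition.add(f)
            phi[f] += v(frozenset(coalition)) - before  # marginal contribution
    return {f: s / len(orders) for f, s in phi.items()}

# Toy value function: additive feature values plus an interaction
# between 'a' and 'b' (purely illustrative).
vals = {'a': 1.0, 'b': 2.0, 'c': 0.5}
v = lambda S: sum(vals[f] for f in S) + (1.0 if {'a', 'b'} <= S else 0.0)

sym = shapley(v, ['a', 'b', 'c'])
# ASV-style: only orderings where the "cause" 'a' precedes its effect 'b';
# the interaction credit then goes entirely to 'b'.
asym = shapley(v, ['a', 'b', 'c'],
               allowed=lambda o: o.index('a') < o.index('b'))
```

Both attributions sum to v(full) - v(empty) = 4.5, but the symmetric version splits the interaction between 'a' and 'b' while the asymmetric one assigns it downstream.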
We consider the problem of learning Neural Ordinary Differential Equations (neural ODEs) within the context of Linear Parameter-Varying (LPV) systems in continuous-time. LPV systems contain bilinear systems, which are known to be universal approximators…
External link:
http://arxiv.org/abs/2307.03630
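For intuition, a continuous-time LPV system with affine dependence on the scheduling signal has the form x' = (A0 + Σ_i p_i(t) A_i) x + B u; choosing the scheduling to depend on the state or input recovers bilinear systems. A forward-Euler simulation sketch, with all matrices and signals invented for illustration:

```python
import numpy as np

def lpv_step(x, u, p, A0, A_sched, B, dt):
    """One forward-Euler step of an LPV system with affine scheduling:
    x' = (A0 + sum_i p_i A_i) x + B u. A toy sketch of the system
    class, not an implementation from the paper."""
    A = A0 + sum(pi * Ai for pi, Ai in zip(p, A_sched))
    return x + dt * (A @ x + B @ u)

rng = np.random.default_rng(1)
n, m, k = 3, 1, 2
A0 = -np.eye(n)                                  # stable nominal dynamics
A_sched = [0.1 * rng.normal(size=(n, n)) for _ in range(k)]
B = rng.normal(size=(n, m))
x = np.zeros(n)
for t in range(100):
    p = np.array([np.sin(0.1 * t), np.cos(0.1 * t)])  # scheduling signal
    u = np.array([1.0])                               # constant input
    x = lpv_step(x, u, p, A0, A_sched, B, dt=0.01)
print(x.shape)                                   # (3,)
```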
The paper makes the first steps towards a behavioral theory of LPV state-space representations with an affine dependency on scheduling, by characterizing minimality of such state-space representations. It is shown that minimality is equivalent to observability…
External link:
http://arxiv.org/abs/2305.08508