Showing 1 - 10 of 105 for the search: '"Kusch, Jonas"'
Author:
Schotthöfer, Steffen, Zangrando, Emanuele, Ceruti, Gianluca, Tudisco, Francesco, Kusch, Jonas
Low-Rank Adaptation (LoRA) has become a widely used method for parameter-efficient fine-tuning of large-scale, pre-trained neural networks. However, LoRA and its extensions face several challenges, including the need for rank adaptivity, robustness, …
External link:
http://arxiv.org/abs/2410.18720
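The snippet above mentions LoRA only in passing. As a hedged illustration of the core idea, the following sketch freezes a pre-trained weight W0 and parametrizes the fine-tuned weight as W0 + B A with a small rank r; all dimensions, the rank, and the initialization are illustrative assumptions, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
d_out, d_in, r = 64, 32, 4          # hypothetical layer sizes; rank r << min(d_out, d_in)

W0 = rng.standard_normal((d_out, d_in))   # frozen pre-trained weight (never updated)

# LoRA trains only the low-rank factors A, B of the update B @ A:
# r * (d_in + d_out) parameters instead of d_in * d_out.
A = 0.01 * rng.standard_normal((r, d_in))
B = np.zeros((d_out, r))                  # B = 0, so the adapter starts as a no-op

def adapted_forward(x):
    """Forward pass through the adapted layer; gradients would flow into A, B only."""
    return W0 @ x + B @ (A @ x)

x = rng.standard_normal(d_in)
assert np.allclose(adapted_forward(x), W0 @ x)   # at initialization, output is unchanged
```

Initializing B to zero is the standard choice: it guarantees the adapted model reproduces the pre-trained model exactly at the start of fine-tuning.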
Author:
Kusch, Jonas
Due to its reduced memory and computational demands, dynamical low-rank approximation (DLRA) has sparked significant interest in multiple research communities. A central challenge in DLRA is the development of time integrators that are robust to the …
External link:
http://arxiv.org/abs/2403.02834
The thermal radiative transfer equations model temperature evolution through a background medium as a result of radiation. When a large number of particles are absorbed in a short time scale, the dynamics tend to a non-linear diffusion-type equation …
External link:
http://arxiv.org/abs/2402.16746
Dynamical low-rank approximation has become a valuable tool to perform an on-the-fly model order reduction for prohibitively large matrix differential equations. A core ingredient is the construction of integrators that are robust to the presence of …
External link:
http://arxiv.org/abs/2402.08607
Numerical simulations of kinetic problems can become prohibitively expensive due to their large memory footprint and computational costs. A method that has proven to successfully reduce these costs is the dynamical low-rank approximation (DLRA). One …
External link:
http://arxiv.org/abs/2311.06399
Computational methods for thermal radiative transfer problems exhibit high computational costs and a prohibitive memory footprint when the spatial and directional domains are finely resolved. A strategy to reduce such computational costs is dynamical …
External link:
http://arxiv.org/abs/2307.07538
Author:
Zangrando, Emanuele, Schotthöfer, Steffen, Ceruti, Gianluca, Kusch, Jonas, Tudisco, Francesco
Reducing parameter redundancies in neural network architectures is crucial for achieving feasible computational and memory requirements during training and inference phases. Given its easy implementation and flexibility, one promising approach is lay…
External link:
http://arxiv.org/abs/2305.19059
This work introduces a parallel and rank-adaptive matrix integrator for dynamical low-rank approximation. The method is related to the previously proposed rank-adaptive basis update & Galerkin (BUG) integrator but differs significantly in that all ar…
External link:
http://arxiv.org/abs/2304.05660
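The parallel, rank-adaptive method described above is not reproduced here. As a rough sketch of the integrator family it builds on, the following implements one step of a fixed-rank basis update & Galerkin (BUG) integrator with explicit Euler substeps; the test problem F(Y) = -Y, the matrix sizes, and the rank are illustrative assumptions.

```python
import numpy as np

def bug_step(U, S, V, F, h):
    """One step of a fixed-rank BUG integrator sketch for dY/dt = F(Y), Y ~ U S V^T.
    Substeps use explicit Euler; a production integrator would use a proper ODE solver."""
    # K-step: evolve K = U S to update the left basis
    K = U @ S
    K = K + h * F(K @ V.T) @ V
    U1, _ = np.linalg.qr(K)
    # L-step: evolve L = V S^T to update the right basis
    L = V @ S.T
    L = L + h * F(U @ L.T).T @ U
    V1, _ = np.linalg.qr(L)
    # S-step: Galerkin update of the small coefficient matrix in the new bases
    S1 = (U1.T @ U) @ S @ (V.T @ V1)
    S1 = S1 + h * U1.T @ F(U1 @ S1 @ V1.T) @ V1
    return U1, S1, V1

# Illustrative test problem: dY/dt = -Y, whose exact solution is exp(-t) * Y0.
rng = np.random.default_rng(1)
Y0 = rng.standard_normal((30, 2)) @ rng.standard_normal((2, 20))  # exactly rank 2
Uf, sf, Vtf = np.linalg.svd(Y0, full_matrices=False)
U, S, V = Uf[:, :2], np.diag(sf[:2]), Vtf[:2].T

h = 0.01
for _ in range(100):                      # integrate to t = 1
    U, S, V = bug_step(U, S, V, lambda Y: -Y, h)

err = np.linalg.norm(U @ S @ V.T - np.exp(-1.0) * Y0) / np.linalg.norm(Y0)
```

For this exactly-rank-2 linear test problem the low-rank factors track the full solution, and the remaining error comes only from the explicit Euler substeps.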
Geophysical flow simulations using hyperbolic shallow water moment equations require an efficient discretization of a potentially large system of PDEs, the so-called moment system. This calls for tailored model order reduction techniques that allow f
External link:
http://arxiv.org/abs/2302.01391
Radiation transport problems are posed in a high-dimensional phase space, limiting the use of finely resolved numerical simulations. An emerging tool to efficiently reduce computational costs and memory footprint in such settings is dynamical low-rank…
External link:
http://arxiv.org/abs/2212.12012