Showing 1 - 10 of 91 for search: "Ringel, Zohar"
Author:
Lavie, Itay, Ringel, Zohar
Kernel ridge regression (KRR) and Gaussian processes (GPs) are fundamental tools in statistics and machine learning with recent applications to highly over-parameterized deep neural networks. The ability of these tools to learn a target function is …
External link:
http://arxiv.org/abs/2406.02663
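The tools named in this entry can be illustrated concretely. Below is a minimal NumPy sketch of kernel ridge regression with an RBF kernel on a toy sin target; the kernel choice, length scale, and ridge value are illustrative assumptions, not the paper's setup.

```python
import numpy as np

def rbf_kernel(X1, X2, length_scale=1.0):
    # Squared-exponential (RBF) kernel matrix between two sets of 1-D inputs.
    d2 = (X1[:, None] - X2[None, :]) ** 2
    return np.exp(-0.5 * d2 / length_scale**2)

def krr_fit_predict(X_train, y_train, X_test, ridge=1e-3):
    # KRR predictor: f(x*) = k(x*, X) (K + ridge * I)^{-1} y
    K = rbf_kernel(X_train, X_train)
    alpha = np.linalg.solve(K + ridge * np.eye(len(X_train)), y_train)
    return rbf_kernel(X_test, X_train) @ alpha

# Toy target: learn sin(x) from noisy samples.
rng = np.random.default_rng(0)
X = np.linspace(0, 2 * np.pi, 40)
y = np.sin(X) + 0.05 * rng.standard_normal(40)
X_star = np.linspace(0, 2 * np.pi, 100)
f_star = krr_fit_predict(X, y, X_star)
```

With a small ridge, the same predictor is also the posterior mean of GP regression with observation-noise variance equal to the ridge, which is the KRR/GP correspondence the abstract alludes to.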
Author:
Fischer, Kirsten, Lindner, Javed, Dahmen, David, Ringel, Zohar, Krämer, Michael, Helias, Moritz
A key property of neural networks driving their success is their ability to learn features from data. Understanding feature learning from a theoretical viewpoint is an emerging field with many open questions. In this work we capture finite-width effects …
External link:
http://arxiv.org/abs/2405.10761
Separating relevant and irrelevant information is key to any modeling process or scientific inquiry. Theoretical physics offers a powerful tool for achieving this in the form of the renormalization group (RG). Here we demonstrate a practical approach …
External link:
http://arxiv.org/abs/2405.06008
Published in:
Proceedings of the 41st International Conference on Machine Learning, Vienna, Austria. PMLR 235, 2024
We study inductive bias in Transformers in the infinitely over-parameterized Gaussian process limit and argue transformers tend to be biased towards more permutation symmetric functions in sequence space. We show that the representation theory of the …
External link:
http://arxiv.org/abs/2402.05173
A key property of deep neural networks (DNNs) is their ability to learn new features during training. This intriguing aspect of deep learning stands out most clearly in recently reported Grokking phenomena. While mainly reflected as a sudden increase …
External link:
http://arxiv.org/abs/2310.03789
State-of-the-art neural networks require extreme computational power to train. It is therefore natural to wonder whether they are optimally trained. Here we apply a recent advancement in stochastic thermodynamics which allows bounding the speed at which …
External link:
http://arxiv.org/abs/2307.14653
Physically informed neural networks (PINNs) are a promising emerging method for solving differential equations. As in many other deep learning approaches, the choice of PINN design and training protocol requires careful craftsmanship. Here, we suggest …
External link:
http://arxiv.org/abs/2307.06362
Author:
Gökmen, Doruk Efe, Biswas, Sounak, Huber, Sebastian D., Ringel, Zohar, Flicker, Felix, Koch-Janusz, Maciej
The physics of complex systems stands to greatly benefit from the qualitative changes in data availability and advances in data-driven computational methods. Many of these systems can be represented by interacting degrees of freedom on inhomogeneous …
External link:
http://arxiv.org/abs/2301.11934
Deep neural networks (DNNs) are powerful tools for compressing and distilling information. Their scale and complexity, often involving billions of inter-dependent parameters, render direct microscopic analysis difficult. Under such circumstances, …
External link:
http://arxiv.org/abs/2112.15383
Author:
Naveh, Gadi, Ringel, Zohar
Deep neural networks (DNNs) in the infinite width/channel limit have received much attention recently, as they provide a clear analytical window to deep learning via mappings to Gaussian Processes (GPs). Despite its theoretical appeal, this viewpoint …
External link:
http://arxiv.org/abs/2106.04110
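The DNN-to-GP mapping this entry refers to can be sketched in its simplest instance: at infinite width, a one-hidden-layer ReLU network with i.i.d. Gaussian weights has outputs distributed as a Gaussian process whose covariance is the arc-cosine kernel of Cho & Saul. A minimal NumPy sketch, with weight variances set to 1 for simplicity (an assumption, not the paper's parameterization):

```python
import numpy as np

def nngp_relu_kernel(X1, X2):
    # NNGP kernel of a one-hidden-layer ReLU network at infinite width:
    # K(x, x') = ||x|| ||x'|| (sin t + (pi - t) cos t) / (2 pi),
    # where t is the angle between x and x'.
    n1 = np.linalg.norm(X1, axis=1)
    n2 = np.linalg.norm(X2, axis=1)
    cos_t = np.clip((X1 @ X2.T) / np.outer(n1, n2), -1.0, 1.0)
    t = np.arccos(cos_t)
    return np.outer(n1, n2) * (np.sin(t) + (np.pi - t) * cos_t) / (2 * np.pi)

# Two unit-norm inputs at angle arccos(0.6).
X = np.array([[1.0, 0.0], [0.6, 0.8]])
K = nngp_relu_kernel(X, X)
# K[0, 0] = 0.5: each unit-norm input has prior variance 1/2 under this kernel.
```

Once this kernel is in hand, exact GP inference (e.g. the KRR-style posterior mean) describes the trained infinite-width network in the corresponding limit, which is the analytical window the abstract mentions.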