Showing 1 - 5 of 5 for search: '"Esser, Pascal Mattia"'
The central question in representation learning is what constitutes a good or meaningful representation. In this work we argue that if we consider data with inherent cluster structures, where clusters can be characterized through different means and…
External link:
http://arxiv.org/abs/2212.01046
Author:
Esser, Pascal Mattia, Nielsen, Frank
A common way to learn and analyze statistical models is to consider operations in the model parameter space. But what happens if we optimize in the parameter space and there is no one-to-one mapping between the parameter space and the underlying stat…
External link:
http://arxiv.org/abs/2206.08598
In recent years, several results in the supervised learning setting suggested that classical statistical learning-theoretic measures, such as VC dimension, do not adequately explain the performance of deep learning models, which prompted a slew of wor…
External link:
http://arxiv.org/abs/2112.03968
Author:
Esser, Pascal Mattia, Nielsen, Frank
When analyzing parametric statistical models, a useful approach consists in modeling geometrically the parameter space. However, even for very simple and commonly used hierarchical models like statistical mixtures or stochastic deep neural networks,…
External link:
http://arxiv.org/abs/2112.03734
The goal of clustering is to group similar objects into meaningful partitions. This process is well understood when an explicit similarity measure between the objects is given. However, far less is known when this information is not readily available…
External link:
http://arxiv.org/abs/2010.03918