Showing 1 - 10 of 141 results for the search: '"Chamroukhi, Faicel"'
Author:
Chamroukhi, Faïcel; Pham, Nhat Thien
In modern machine learning problems, we deal with datasets that are either distributed by nature or potentially large, for which distributing the computations is usually a standard way to proceed, since centralized algorithms are in general ineffective …
External link:
http://arxiv.org/abs/2312.09877
Published in:
Proceedings of Machine Learning Research 204:1-20, 2023, Conformal and Probabilistic Prediction with Applications
An important mathematical tool in the analysis of dynamical systems is the approximation of the reach set, i.e., the set of states reachable after a given time from a given initial state. This set is difficult to compute for complex systems even if …
External link:
http://arxiv.org/abs/2309.08976
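For orientation, the reach set mentioned in this abstract is commonly defined as follows (generic notation, assuming a flow map \varphi(t; x_0) giving the state at time t of the trajectory started at x_0):

R(T; X_0) = \{ \varphi(T; x_0) : x_0 \in X_0 \},

i.e., the set of all states attainable at time T from some initial state in the set X_0.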
Author:
Pham, Nhat Thien; Chamroukhi, Faicel
We develop a mixtures-of-experts (ME) approach to multiclass classification where the predictors are univariate functions. It consists of an ME model in which both the gating network and the experts network are constructed upon multinomial logistic …
External link:
http://arxiv.org/abs/2202.13934
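As a rough sketch of such a model (generic notation assumed here, not taken from the paper), a mixture-of-experts classifier in which both the gating and the experts are multinomial logistic models takes the form

P(Y = m \mid x) = \sum_{k=1}^{K} \frac{\exp(w_k^\top x)}{\sum_{l=1}^{K} \exp(w_l^\top x)} \cdot \frac{\exp(\beta_{km}^\top x)}{\sum_{m'=1}^{M} \exp(\beta_{km'}^\top x)},

with K experts and M classes; for functional predictors, x would typically be a vector of basis-expansion coefficients of the observed curve.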
We consider the statistical analysis of heterogeneous data for prediction in situations where the observations include functions, typically time series. We extend the modeling with Mixtures-of-Experts (ME), as a framework of choice in modeling heterogeneity …
External link:
http://arxiv.org/abs/2202.02249
Dual-energy computed tomography (DECT) is an advanced CT scanning technique enabling material characterization not possible with conventional CT scans. It allows the reconstruction of energy decay curves at each 3D image voxel, representing varying …
External link:
http://arxiv.org/abs/2201.13398
Model selection, via penalized-likelihood-type criteria, is a standard task in many statistical inference and machine learning problems. Progress has led to deriving criteria with asymptotic consistency results and an increasing emphasis on introducing …
External link:
http://arxiv.org/abs/2104.08959
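As one standard example of a penalized-likelihood criterion (an illustration only, not the criterion derived in the paper), the BIC of a candidate model m with \nu_m free parameters, maximized likelihood \hat{L}_m, and n observations is

\mathrm{BIC}(m) = -2 \log \hat{L}_m + \nu_m \log n,

and the selected model is the one minimizing this trade-off between goodness of fit and complexity.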
Published in:
Electronic Journal of Statistics 2022
Mixture-of-experts (MoE) models are a popular class of statistical and machine learning models that have gained attention over the years due to their flexibility and efficiency. In this work, we consider Gaussian-gated localized MoE (GLoME) and block-diagonal …
External link:
http://arxiv.org/abs/2104.02640
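For context, a Gaussian-gated (localized) mixture of experts uses gating weights of the generic form (notation assumed here)

g_k(x) = \frac{\pi_k \, \phi(x; \mu_k, \Sigma_k)}{\sum_{l=1}^{K} \pi_l \, \phi(x; \mu_l, \Sigma_l)},

where \phi(\cdot; \mu, \Sigma) is a multivariate Gaussian density, so each expert is weighted by how plausible the input x is under its local Gaussian component.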
Published in:
Journal of Statistical Distributions and Applications, 8, Article number: 13 (2021)
Mixture of experts (MoE) models are widely applied to conditional probability density estimation problems. We demonstrate the richness of the class of MoE models by proving denseness results in Lebesgue spaces when the input and output variables are …
External link:
http://arxiv.org/abs/2012.02385
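The conditional densities in question have the generic MoE form (standard notation, not quoted from the abstract)

p(y \mid x) = \sum_{k=1}^{K} g_k(x) \, f_k(y \mid x; \theta_k), \qquad g_k(x) \ge 0, \quad \sum_{k=1}^{K} g_k(x) = 1,

with gating functions g_k and expert conditional densities f_k.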
We investigate the estimation properties of the mixture of experts (MoE) model in a high-dimensional setting, where the number of predictors is much larger than the sample size and for which the literature is particularly lacking in theoretical results …
External link:
http://arxiv.org/abs/2009.10622
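In such high-dimensional regimes, estimation is typically based on an l1-penalized likelihood; a generic sketch (notation assumed, not necessarily the paper's exact objective) is

\hat{\theta} \in \arg\max_{\theta} \; \frac{1}{n} \sum_{i=1}^{n} \log p(y_i \mid x_i; \theta) - \lambda \lVert \theta \rVert_1,

where the penalty level \lambda > 0 shrinks coefficients of irrelevant predictors toward zero when the number of predictors exceeds the sample size n.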
Approximation of probability density functions via location-scale finite mixtures in Lebesgue spaces
The class of location-scale finite mixtures is of enduring interest from both applied and theoretical perspectives of probability and statistics. We prove the following results: to an arbitrary degree of accuracy, (a) location-scale mixtures of a continuous …
External link:
http://arxiv.org/abs/2008.09787
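For reference, a location-scale finite mixture built from a base density g on \mathbb{R}^d has the standard form (generic definition, not text from the abstract)

f(x) = \sum_{k=1}^{K} \pi_k \, \sigma_k^{-d} \, g\!\left(\frac{x - \mu_k}{\sigma_k}\right), \qquad \pi_k \ge 0, \quad \sum_{k=1}^{K} \pi_k = 1,

with location parameters \mu_k \in \mathbb{R}^d and scale parameters \sigma_k > 0.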