Showing 1 - 8 of 8 for search: '"Alaa Maalouf"'
Published in:
Sensors, Vol 21, Iss 19, p 6689 (2021)
A coreset is usually a small weighted subset of an input set of items that provably approximates their loss function for a given set of queries (models, classifiers, hypotheses). That is, the maximum (worst-case) error over all queries is bounded. To
External link:
https://doaj.org/article/26bcef9252b0428280e372df224f3a98
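The coreset idea in this abstract can be illustrated with a toy numpy sketch: a small weighted subset whose weighted loss tracks the full loss over a set of queries. Note this uses plain uniform sampling for brevity, which does not give the worst-case guarantee the paper is about; all names here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
P = rng.normal(size=(100_000, 2))          # input set of items (points)

# Toy "coreset": a uniform sample with weights n/m. (Provable coresets use
# importance sampling to bound the worst-case error; this is only a sketch.)
m = 2_000
idx = rng.choice(len(P), size=m, replace=False)
C, w = P[idx], np.full(m, len(P) / m)

def loss(points, q, weights=None):
    """Sum of squared distances from points to a query center q."""
    d2 = ((points - q) ** 2).sum(axis=1)
    return d2.sum() if weights is None else (weights * d2).sum()

# Compare full loss vs. weighted-subset loss over a few queries (models).
queries = rng.normal(size=(5, 2))
errs = [abs(loss(C, q, w) - loss(P, q)) / loss(P, q) for q in queries]
print(max(errs))  # worst relative error over the sampled queries
```

The weighted subset stands in for the full set in any downstream optimization over the queries, which is what makes coresets useful for compression and streaming.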
Published in:
Sensors, Vol 21, Iss 16, p 5599 (2021)
A common technique for compressing a neural network is to compute the $k$-rank $\ell_2$ approximation $A_k$ of the matrix $A\in\mathbb{R}^{n\times d}$ via SVD that corresponds to a fully connected layer (or embedding layer). Here, $d$ is the number of input neurons in the layer,
External link:
https://doaj.org/article/18b56702a14f476393df979512e7125a
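The compression scheme the abstract describes, replacing a fully connected layer's matrix by its $k$-rank approximation, can be sketched with numpy's truncated SVD (a minimal illustration, not the paper's method; sizes are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
n, d, k = 512, 256, 32
A = rng.normal(size=(n, d))                # weights of a fully connected layer

# k-rank ℓ2-optimal approximation of A via truncated SVD.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k]

# The layer x -> A x can be replaced by two thinner layers,
# x -> (U_k diag(s_k)) (V_k^T x), storing k(n + d) numbers instead of n*d.
original, compressed = n * d, k * (n + d)
err = np.linalg.norm(A - A_k) / np.linalg.norm(A)
print(compressed / original, err)
```

By the Eckart-Young theorem, $A_k$ minimizes the Frobenius-norm error among all rank-$k$ matrices, so this factorization is the standard baseline the paper builds on.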
Published in:
IEEE Transactions on Pattern Analysis and Machine Intelligence. 44:9977-9994
Least-mean-squares (LMS) solvers such as Linear / Ridge-Regression and SVD not only solve fundamental machine learning problems, but are also the building blocks in a variety of other methods, such as matrix factorizations. We suggest an algorithm th
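As a reminder of the LMS building block the abstract refers to, here is a minimal ridge-regression solver via SVD, cross-checked against the normal equations (an illustrative sketch, not the paper's accelerated algorithm; all names are made up):

```python
import numpy as np

rng = np.random.default_rng(2)
n, d = 10_000, 20
A = rng.normal(size=(n, d))
b = A @ rng.normal(size=d) + 0.1 * rng.normal(size=n)

def ridge_via_svd(A, b, lam):
    """argmin_x ||Ax - b||^2 + lam * ||x||^2, solved through the SVD of A."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return Vt.T @ ((s / (s ** 2 + lam)) * (U.T @ b))

x = ridge_via_svd(A, b, lam=1.0)
# Cross-check against the normal equations (A^T A + lam I) x = A^T b.
x_ref = np.linalg.solve(A.T @ A + 1.0 * np.eye(d), A.T @ b)
print(np.allclose(x, x_ref))
```

The SVD route is numerically more stable than forming $A^\top A$ explicitly, which is one reason LMS solvers appear as building blocks inside larger pipelines.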
Published in:
IEEE Transactions on Neural Networks and Learning Systems.
A coreset of a given dataset and loss function is usually a small weighted set that approximates this loss for every query from a given set of queries. Coresets have been shown to be very useful in many applications. However, coreset construction is done in
Published in:
2021 IEEE/CVF International Conference on Computer Vision (ICCV).
Published in:
WIREs Data Mining and Knowledge Discovery. 11
Published in:
Sensors
Volume 21
Issue 16
Sensors (Basel, Switzerland)
Sensors, Vol 21, Iss 16, p 5599 (2021)
A common technique for compressing a neural network is to compute the $k$-rank $\ell_2$ approximation $A_k$ of the matrix $A\in\mathbb{R}^{n\times d}$ via SVD that corresponds to a fully connected layer (or embedding layer). Here, $d$ is the number of input neurons in the layer,
Published in:
KDD
An $\varepsilon$-coreset for Least-Mean-Squares (LMS) of a matrix $A\in{\mathbb{R}}^{n\times d}$ is a small weighted subset of its rows that approximates the sum of squared distances from its rows to every affine $k$-dimensional subspace of ${\mathbb{R}}^{d}$.
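The quantity in this definition, the sum of squared distances from the rows of $A$ to a $k$-dimensional subspace (here, for simplicity, a linear one through the origin), can be evaluated directly. The sketch below compares the full cost with the weighted cost of a uniformly sampled subset; this is only an illustration, since an actual $\varepsilon$-coreset must bound the error for every subspace:

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.normal(size=(5_000, 3))

def cost(rows, X, weights=None):
    """Weighted sum of squared distances from rows to the subspace spanned by
    the orthonormal columns of X (a linear k-subspace through the origin)."""
    resid = rows - (rows @ X) @ X.T            # residual after projection
    d2 = (resid ** 2).sum(axis=1)
    return d2.sum() if weights is None else (weights * d2).sum()

# A toy weighted subset (uniform sample, weight n/m); a true coreset would use
# a provable construction with a worst-case guarantee over all subspaces.
m = 500
idx = rng.choice(len(A), size=m, replace=False)
C, w = A[idx], np.full(m, len(A) / m)

X, _ = np.linalg.qr(rng.normal(size=(3, 2)))   # a random 2-dim subspace of R^3
print(abs(cost(C, X, w) - cost(A, X)) / cost(A, X))
```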