Showing 1 - 10 of 25 for search: '"Farnell, Elin"'
A support vector machine (SVM) is an algorithm that finds a hyperplane which optimally separates labeled data points in $\mathbb{R}^n$ into positive and negative classes. The data points on the margin of this separating hyperplane are called support vectors. …
External link:
http://arxiv.org/abs/2011.00617
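A minimal illustration of the notion described in the snippet above, assuming scikit-learn is available; the synthetic data and every parameter choice are invented for the example and are not taken from the paper.

```python
# Minimal sketch (not from the paper): fit a linear SVM and inspect its
# support vectors with scikit-learn. Data and parameters are illustrative.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X_pos = rng.normal(loc=+2.0, scale=1.0, size=(50, 2))   # positive class
X_neg = rng.normal(loc=-2.0, scale=1.0, size=(50, 2))   # negative class
X = np.vstack([X_pos, X_neg])
y = np.hstack([np.ones(50), -np.ones(50)])

clf = SVC(kernel="linear", C=1.0).fit(X, y)

# The data points on (or inside) the margin of the separating hyperplane
# are exposed as the fitted model's support vectors.
print("number of support vectors:", clf.support_vectors_.shape[0])
print("hyperplane normal w:", clf.coef_[0], "offset b:", clf.intercept_[0])
```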
Author:
Kvinge, Henry, Farnell, Elin, Dupuis, Julia R., Kirby, Michael, Peterson, Chris, Schundler, Elizabeth C.
Compressive sensing (CS) is a method of sampling which permits some classes of signals to be reconstructed with high accuracy even when they are under-sampled. In this paper we explore a phenomenon in which bandwise CS sampling of a hyperspectral data …
External link:
http://arxiv.org/abs/1906.11818
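A hedged sketch of what bandwise compressive sampling of a hyperspectral cube could look like, with a separate random Gaussian measurement matrix per spectral band; the cube, the sampling ratio, and the measurement model are assumptions for illustration, not the instrument or scheme studied in the paper.

```python
# Hedged sketch of "bandwise" compressive measurements of a hyperspectral cube:
# each spectral band is sampled with its own random measurement matrix.
import numpy as np

rng = np.random.default_rng(1)
rows, cols, bands = 32, 32, 20           # toy hyperspectral cube
cube = rng.random((rows, cols, bands))

n = rows * cols                          # pixels per band
m = n // 4                               # 4x undersampling per band

measurements = []
for b in range(bands):
    A_b = rng.normal(size=(m, n)) / np.sqrt(m)   # per-band sensing matrix
    y_b = A_b @ cube[:, :, b].ravel()            # m compressive samples of band b
    measurements.append((A_b, y_b))

print(f"stored {m} samples per band instead of {n} ({m / n:.0%} sampling rate)")
```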
Author:
Farnell, Elin, Kvinge, Henry, Dupuis, Julia R., Kirby, Michael, Peterson, Chris, Schundler, Elizabeth C.
One of the fundamental assumptions of compressive sensing (CS) is that a signal can be reconstructed from a small number of samples by solving an optimization problem with the appropriate regularization term. Two standard regularization terms are the …
External link:
http://arxiv.org/abs/1906.10603
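As a concrete instance of reconstruction with a regularization term, the sketch below minimizes $\tfrac{1}{2}\|Ax-y\|_2^2 + \lambda\|x\|_1$ with plain iterative soft-thresholding (ISTA); the $\ell_1$ term is one common choice and not necessarily one of the two terms compared in the paper, and all sizes and parameters are illustrative.

```python
# Hedged sketch (not the paper's solver): reconstruct a sparse signal from
# undersampled measurements by minimizing 0.5*||Ax - y||^2 + lam*||x||_1
# with iterative soft-thresholding (ISTA).
import numpy as np

rng = np.random.default_rng(2)
n, m, k = 200, 60, 8                      # signal length, measurements, sparsity
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.normal(size=k)

A = rng.normal(size=(m, n)) / np.sqrt(m)  # random sensing matrix
y = A @ x_true                            # undersampled measurements

lam = 0.01
step = 1.0 / np.linalg.norm(A, 2) ** 2    # 1 / Lipschitz constant of the gradient
x = np.zeros(n)
for _ in range(500):
    grad = A.T @ (A @ x - y)
    z = x - step * grad
    x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft threshold

print("relative reconstruction error:",
      np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```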
Author:
Farnell, Elin, Kvinge, Henry, Dixon, John P., Dupuis, Julia R., Kirby, Michael, Peterson, Chris, Schundler, Elizabeth C., Smith, Christian W.
Sampling is a fundamental aspect of any implementation of compressive sensing. Typically, the choice of sampling method is guided by the reconstruction basis. However, this approach can be problematic with respect to certain hardware constraints and …
External link:
http://arxiv.org/abs/1906.08869
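One way to picture the tension between sampling hardware and the reconstruction basis: the sketch below samples with binary 0/1 masks (a hardware-friendly assumption) while reconstructing under an assumed DCT-sparsity prior, reusing the same ISTA loop as above; none of this is the specific sampling scheme proposed in the paper.

```python
# Hedged sketch of decoupling the sampling method from the reconstruction basis:
# binary (0/1) masks do the sampling, reconstruction assumes DCT sparsity.
import numpy as np
from scipy.fft import idct

rng = np.random.default_rng(3)
n, m = 256, 64
coeffs = np.zeros(n)
coeffs[[3, 17, 40]] = [1.0, -0.5, 0.8]          # sparse in the DCT domain
signal = idct(coeffs, norm="ortho")              # smooth signal in the sample domain

Phi = rng.integers(0, 2, size=(m, n)).astype(float)   # binary sensing masks
y = Phi @ signal

# Effective matrix acting on the sparse coefficients: A = Phi @ (inverse DCT).
A = Phi @ idct(np.eye(n), axis=0, norm="ortho")

lam, step, c = 0.01, 1.0 / np.linalg.norm(A, 2) ** 2, np.zeros(n)
for _ in range(1000):
    z = c - step * (A.T @ (A @ c - y))
    c = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)   # soft threshold

print("coefficient error:", np.linalg.norm(c - coeffs))
```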
In many situations, classes of data points of primary interest also happen to be those that are least numerous. A well-known example is detection of fraudulent transactions among the collection of all financial transactions, the vast majority of which …
External link:
http://arxiv.org/abs/1901.10585
Dimensionality-reduction methods are a fundamental tool in the analysis of large data sets. These algorithms work on the assumption that the "intrinsic dimension" of the data is generally much smaller than the ambient dimension in which it is collected. …
External link:
http://arxiv.org/abs/1810.11562
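A small illustration of the intrinsic-versus-ambient-dimension assumption, using nothing beyond NumPy; the dimensions, the random linear embedding, and the noise level are invented for the example.

```python
# Points near a 3-dimensional subspace are embedded in R^50; the singular
# value spectrum reveals the small intrinsic dimension.
import numpy as np

rng = np.random.default_rng(4)
intrinsic, ambient, n_points = 3, 50, 500
latent = rng.normal(size=(n_points, intrinsic))            # 3-d latent coordinates
embed = rng.normal(size=(intrinsic, ambient))              # random linear embedding
X = latent @ embed + 0.01 * rng.normal(size=(n_points, ambient))  # small noise

X_centered = X - X.mean(axis=0)
singular_values = np.linalg.svd(X_centered, compute_uv=False)
print("top 6 singular values:", np.round(singular_values[:6], 2))
# Only ~3 singular values are large, matching the intrinsic dimension.
```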
A fundamental question in many data analysis settings is the problem of discerning the "natural" dimension of a data set. That is, when a data set is drawn from a manifold (possibly with noise), a meaningful aspect of the data is the dimension of that manifold. …
External link:
http://arxiv.org/abs/1808.01686
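For the dimension-estimation problem itself, one standard tool is the Levina-Bickel maximum-likelihood estimator based on nearest-neighbor distances; the sketch below applies it to a noisy sphere and is only an assumed stand-in, not necessarily the approach taken in the paper.

```python
# Hedged sketch of the Levina-Bickel MLE intrinsic-dimension estimator.
# Data, neighborhood size k, and the estimator choice are assumptions.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(5)
# Sample a noisy 2-manifold (a sphere, intrinsic dimension 2) embedded in R^10.
theta, phi = rng.uniform(0, np.pi, 2000), rng.uniform(0, 2 * np.pi, 2000)
sphere = np.stack([np.sin(theta) * np.cos(phi),
                   np.sin(theta) * np.sin(phi),
                   np.cos(theta)], axis=1)
X = np.hstack([sphere, np.zeros((2000, 7))]) + 0.01 * rng.normal(size=(2000, 10))

k = 15
dists, _ = cKDTree(X).query(X, k=k + 1)       # first column is the point itself
dists = dists[:, 1:]                          # distances to the k nearest neighbors

# Per-point inverse MLE (1/(k-1)) * sum_j log(T_k / T_j), then average the inverses.
inv_mle = np.mean(np.log(dists[:, -1:] / dists[:, :-1]), axis=1)
print("estimated intrinsic dimension:", 1.0 / inv_mle.mean())   # ~ 2
```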
Author:
Adams, Henry, Aminian, Manuchehr, Farnell, Elin, Kirby, Michael, Peterson, Chris, Mirth, Joshua, Neville, Rachel, Shipman, Patrick, Shonkwiler, Clayton
Published in:
In: Baas N., Carlsson G., Quick G., Szymik M., Thaule M. (eds), Topological Data Analysis. Abel Symposia, vol 15. Springer (2020)
We use persistent homology in order to define a family of fractal dimensions, denoted $\mathrm{dim}_{\mathrm{PH}}^i(\mu)$ for each homological dimension $i\ge 0$, assigned to a probability measure $\mu$ on a metric space. The case of $0$-dimensional …
External link:
http://arxiv.org/abs/1808.01079
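A rough sketch of the $0$-dimensional idea only, using the classical fact that the $0$-dimensional persistence intervals of a point cloud have the same lengths as the edges of a minimal spanning tree; the exponent-to-dimension conversion below relies on the classical MST scaling $\sim n^{(d-1)/d}$ and is an illustrative assumption, not the paper's definition of $\mathrm{dim}_{\mathrm{PH}}^i(\mu)$.

```python
# Hedged sketch: estimate a dimension from the growth rate of the total
# 0-dimensional persistence (= total MST edge length) with the sample size.
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree
from scipy.spatial.distance import pdist, squareform

rng = np.random.default_rng(6)

def mst_total_length(points):
    # Sum of 0-dimensional persistence interval lengths = total MST edge length.
    return minimum_spanning_tree(squareform(pdist(points))).sum()

sizes = np.array([200, 400, 800, 1600, 3200])
lengths = np.array([mst_total_length(rng.random((n, 2))) for n in sizes])  # d = 2

slope = np.polyfit(np.log(sizes), np.log(lengths), 1)[0]    # ~ (d-1)/d
print("estimated dimension:", 1.0 / (1.0 - slope))           # ~ 2 for the unit square
```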
Dimensionality-reduction techniques are a fundamental tool for extracting useful information from high-dimensional data sets. Because secant sets encode manifold geometry, they are a useful tool for designing meaningful data-reduction algorithms. In …
External link:
http://arxiv.org/abs/1807.03425
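To make the role of secants concrete, the sketch below forms the normalized secant set of a point cloud and checks how well a candidate linear projection preserves secant lengths (a projected secant near zero means two distinct points collapse together); the projection used here is plain PCA as a simple stand-in, not the projection algorithm developed in the paper.

```python
# Hedged sketch: normalized secants and how well a linear projection preserves them.
import numpy as np

rng = np.random.default_rng(7)
n, ambient, target = 150, 10, 3
X = rng.normal(size=(n, ambient))

# Normalized secants b_ij = (x_i - x_j) / ||x_i - x_j|| for i < j.
i, j = np.triu_indices(n, k=1)
secants = X[i] - X[j]
secants /= np.linalg.norm(secants, axis=1, keepdims=True)

# Candidate projection: top `target` principal directions of the data (PCA stand-in).
_, _, Vt = np.linalg.svd(X - X.mean(axis=0), full_matrices=False)
P = Vt[:target]                          # target x ambient projection matrix

projected_norms = np.linalg.norm(secants @ P.T, axis=1)
print("worst-preserved secant length after projection:", projected_norms.min())
```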
Endmember extraction plays a prominent role in a variety of data analysis problems, as endmembers often correspond to data representing the purest or best representative of some feature. Identifying endmembers can then be useful for further identification …
External link:
http://arxiv.org/abs/1807.01401
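A generic illustration of endmember selection using a simple classical heuristic in the spirit of successive-projection methods (repeatedly pick the point with the largest residual after projecting out the endmembers chosen so far); this is not the extraction method proposed in the paper, and the toy mixed data is invented.

```python
# Hedged sketch of a simple endmember-selection heuristic on toy mixed data.
import numpy as np

rng = np.random.default_rng(8)
# Toy "mixed" data: convex combinations of 3 hidden pure spectra in R^20.
pure = rng.random((3, 20))
weights = rng.dirichlet(np.ones(3), size=300)
X = weights @ pure

def select_endmembers(data, k):
    residual = data.copy()
    chosen = []
    for _ in range(k):
        idx = int(np.argmax(np.linalg.norm(residual, axis=1)))   # most extreme point
        chosen.append(idx)
        v = residual[idx] / np.linalg.norm(residual[idx])
        residual = residual - np.outer(residual @ v, v)           # project v out
    return chosen

print("indices of candidate endmembers:", select_endmembers(X, 3))
```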