Showing 1 - 10 of 44 for search: '"Charisopoulos, Vasileios"'
We study iterative signal reconstruction in computed tomography (CT), wherein measurements are produced by a linear transformation of the unknown signal followed by an exponential nonlinear map. Approaches based on pre-processing the data with a log…
External link: http://arxiv.org/abs/2407.12984
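
A minimal sketch of the measurement model this abstract describes, under the assumption (not stated above) of a Beer-Lambert-style map y = exp(-Ax); all names and dimensions are illustrative, not taken from the paper:

    import numpy as np

    # Assumed model (illustrative): measurements y = exp(-A @ x) for a
    # linear forward map A applied to the unknown signal x.
    rng = np.random.default_rng(0)
    m, n = 200, 50
    A = rng.standard_normal((m, n)) / np.sqrt(m)  # hypothetical forward operator
    x_true = rng.random(n)                        # unknown nonnegative signal
    y = np.exp(-A @ x_true)                       # exponential nonlinear measurements

    # Log pre-processing: -log(y) = A @ x, so recovery becomes least squares.
    b = -np.log(y)
    x_hat, *_ = np.linalg.lstsq(A, b, rcond=None)
    print(np.linalg.norm(x_hat - x_true))         # ~0 in this noiseless sketch
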
Author: Melia, Owen; Tsang, Olivia; Charisopoulos, Vasileios; Khoo, Yuehaw; Hoskins, Jeremy; Willett, Rebecca
Interpreting scattered acoustic and electromagnetic wave patterns is a computational task that enables remote imaging in a number of important applications, including medical imaging, geophysical exploration, sonar and radar detection, and nondestructive…
External link: http://arxiv.org/abs/2405.13214
In this paper, we study the stochastic linear bandit problem under the additional requirements of differential privacy, robustness and batched observations. In particular, we assume an adversary randomly chooses a constant fraction of the observed rewards…
External link: http://arxiv.org/abs/2304.11741
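
A hedged illustration of the corruption model sketched in this abstract, with a simple trimmed least-squares refit standing in for a robust estimator; the differential-privacy and batching components of the paper are not reproduced, and every constant here is an assumption:

    import numpy as np

    # Toy corruption model: linear rewards, a constant fraction replaced by
    # arbitrary values chosen by an adversary.
    rng = np.random.default_rng(1)
    d, T, alpha = 5, 1000, 0.1                    # dimension, rounds, corruption rate
    theta = rng.standard_normal(d)                # unknown parameter
    X = rng.standard_normal((T, d))               # actions played, one per round
    rewards = X @ theta + 0.1 * rng.standard_normal(T)
    corrupt = rng.random(T) < alpha               # adversary picks ~alpha*T rounds
    rewards[corrupt] = 100.0                      # arbitrary corrupted values

    # A simple robust fit (not the paper's estimator): least squares, then a
    # refit after trimming the observations with the largest residuals.
    theta0 = np.linalg.lstsq(X, rewards, rcond=None)[0]
    resid = np.abs(X @ theta0 - rewards)
    keep = resid <= np.quantile(resid, 1 - alpha) # drop suspected corruptions
    theta_hat = np.linalg.lstsq(X[keep], rewards[keep], rcond=None)[0]
    print(np.linalg.norm(theta0 - theta), np.linalg.norm(theta_hat - theta))
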
Meeting growing demand for automotive battery resources is predicted to be costly from both economic and environmental perspectives. To minimize these costs, battery resources should be deployed as efficiently as possible. A potential source of inefficiency…
External link: http://arxiv.org/abs/2304.10461
Author: Charisopoulos, Vasileios; Damle, Anil
We develop an eigenspace estimation algorithm for distributed environments with arbitrary node failures, where a subset of computing nodes can return structurally valid but otherwise arbitrarily chosen responses. Notably, this setting encompasses several…
External link: http://arxiv.org/abs/2206.00127
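
An illustrative sketch of the failure model this abstract describes: faulty nodes return structurally valid (orthonormal) but arbitrary bases. The naive median-distance filter below is a stand-in, not the paper's algorithm:

    import numpy as np

    # Each node reports an orthonormal n-by-r basis; "failed" nodes report a
    # structurally valid but arbitrary subspace.
    rng = np.random.default_rng(2)
    n, r, nodes, faulty = 100, 3, 9, 2
    A = rng.standard_normal((n, n)); A = A + A.T  # shared symmetric matrix
    V_true = np.linalg.eigh(A)[1][:, -r:]         # reference top-r eigenspace

    responses = []
    for i in range(nodes):
        if i < faulty:                            # arbitrary orthonormal response
            Q, _ = np.linalg.qr(rng.standard_normal((n, r)))
            responses.append(Q)
        else:                                     # honest: noisy local eigenspace
            E = 0.01 * rng.standard_normal((n, n))
            responses.append(np.linalg.eigh(A + E + E.T)[1][:, -r:])

    # Naive filter: keep the response whose median projector distance to the
    # others is smallest; honest answers cluster, arbitrary ones do not.
    dist = lambda U, V: np.linalg.norm(U @ U.T - V @ V.T)
    scores = [np.median([dist(U, V) for V in responses]) for U in responses]
    V_hat = responses[int(np.argmin(scores))]
    print(dist(V_hat, V_true))                    # small: an honest node wins
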
Author: Charisopoulos, Vasileios; Davis, Damek
Subgradient methods comprise a fundamental class of nonsmooth optimization algorithms. Classical results show that certain subgradient methods converge sublinearly for general Lipschitz convex functions and converge linearly for convex functions that…
External link: http://arxiv.org/abs/2201.04611
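
A minimal sketch of the linear-rate regime this abstract contrasts with sublinear rates, using the classical Polyak step size on the sharp function f(x) = ||Ax - b|| with known minimal value 0; the paper's own method may differ:

    import numpy as np

    # Polyak-step subgradient method on f(x) = ||A x - b||; on this sharp
    # function the error decays geometrically.
    rng = np.random.default_rng(3)
    m, n = 60, 20
    A = rng.standard_normal((m, n))
    b = A @ rng.standard_normal(n)                # consistent system, min value 0

    x = np.zeros(n)
    for k in range(200):
        resid = A @ x - b
        fx = np.linalg.norm(resid)
        if fx < 1e-12:
            break
        g = A.T @ (resid / fx)                    # subgradient of ||Ax - b|| at x
        x -= fx / np.dot(g, g) * g                # Polyak step with f* = 0
        if k % 50 == 0:
            print(k, fx)                          # geometric decrease across prints
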
Distributed computing is a standard way to scale up machine learning and data science algorithms to process large amounts of data. In such settings, avoiding communication amongst machines is paramount for achieving high performance. Rather than dist…
External link: http://arxiv.org/abs/2009.02436
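
A hedged sketch of one communication-light pattern consistent with this abstract, though not necessarily the paper's exact procedure: every machine sends a local top-r eigenspace once, and a central node aligns the bases by orthogonal Procrustes before averaging:

    import numpy as np

    # One-shot aggregation: local eigenspaces, Procrustes alignment, average.
    rng = np.random.default_rng(4)
    n, r, machines = 80, 2, 8
    A = rng.standard_normal((n, n)); A = A + A.T

    local_bases = []
    for _ in range(machines):
        E = 0.02 * rng.standard_normal((n, n))    # machine-specific perturbation
        local_bases.append(np.linalg.eigh(A + E + E.T)[1][:, -r:])

    ref = local_bases[0]                          # align everyone to machine 0
    aligned = []
    for V in local_bases:
        U, _, Wt = np.linalg.svd(V.T @ ref)       # orthogonal Procrustes solution
        aligned.append(V @ (U @ Wt))
    V_bar, _ = np.linalg.qr(np.mean(aligned, axis=0))  # average + re-orthonormalize

    V_ref = np.linalg.eigh(A)[1][:, -r:]
    print(np.linalg.norm(V_bar @ V_bar.T - V_ref @ V_ref.T))
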
Several problems in machine learning, statistics, and other fields rely on computing eigenvectors. For large scale problems, the computation of these eigenvectors is typically performed via iterative schemes such as subspace iteration or Krylov methods…
External link: http://arxiv.org/abs/2002.08491
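
A minimal NumPy rendering of subspace iteration, one of the iterative schemes named in the abstract; the matrix and block size are toy choices:

    import numpy as np

    # Subspace iteration: multiply a block of vectors by A, re-orthonormalize,
    # repeat; the block converges to the top-r eigenspace of a symmetric A.
    def subspace_iteration(A, r, iters=1000, seed=0):
        rng = np.random.default_rng(seed)
        V, _ = np.linalg.qr(rng.standard_normal((A.shape[0], r)))
        for _ in range(iters):
            V, _ = np.linalg.qr(A @ V)            # power step + QR
        return V

    A = np.diag(np.arange(1.0, 101.0))            # toy symmetric matrix
    V = subspace_iteration(A, r=3)
    print(np.linalg.norm(V[:-3, :]))              # ~0: spans the top 3 coordinates
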
Several fundamental tasks in data science rely on computing an extremal eigenspace of size $r \ll n$, where $n$ is the underlying problem dimension. For example, spectral clustering and PCA both require the computation of the leading $r$-dimensional eigenspace…
External link: http://arxiv.org/abs/1909.01188
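
For reference, the leading $r$-dimensional eigenspace mentioned here admits the standard Ky Fan variational characterization (a textbook fact, not a claim about the paper's formulation): for symmetric $A \in \mathbb{R}^{n \times n}$,

    \max_{\substack{V \in \mathbb{R}^{n \times r} \\ V^\top V = I_r}} \operatorname{tr}\left( V^\top A V \right) = \lambda_1(A) + \cdots + \lambda_r(A),

with the maximum attained when the columns of $V$ span the invariant subspace associated with the $r$ largest eigenvalues.
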
Stochastic (sub)gradient methods require step size schedule tuning to perform well in practice. Classical tuning strategies decay the step size polynomially and lead to optimal sublinear rates on (strongly) convex problems. An alternative schedule, p…
External link: http://arxiv.org/abs/1907.09547
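
A toy contrast of the two schedule families this abstract describes, polynomial decay versus geometric decay, on a noisy subgradient run for the sharp function f(x) = |x|; all constants are arbitrary and illustrative:

    import numpy as np

    # Noisy subgradient descent on f(x) = |x| under two step size schedules.
    rng = np.random.default_rng(5)

    def run(schedule, T=2000, x0=5.0, noise=0.5):
        x = x0
        for k in range(1, T + 1):
            g = np.sign(x) + noise * rng.standard_normal()  # noisy subgradient
            x -= schedule(k) * g
        return abs(x)

    poly = lambda k: 1.0 / np.sqrt(k)             # classical polynomial decay
    geom = lambda k: 0.995 ** k                   # geometrically decaying steps
    print("polynomial:", run(poly), "geometric:", run(geom))
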