Showing 1 - 10 of 130 results for the search: '"Jakovetić, Dušan"'
We study large deviations and mean-squared error (MSE) guarantees of a general framework of nonlinear stochastic gradient methods in the online setting, in the presence of heavy-tailed noise. Unlike existing works that rely on the closed form of a no…
External link:
http://arxiv.org/abs/2410.15637
Authors:
Armacki, Aleksandar, Yu, Shuhua, Sharma, Pranay, Joshi, Gauri, Bajovic, Dragana, Jakovetic, Dusan, Kar, Soummya
We study high-probability convergence in online learning, in the presence of heavy-tailed noise. To combat the heavy tails, a general framework of nonlinear SGD methods is considered, subsuming several popular nonlinearities like sign, quantization, …
External link:
http://arxiv.org/abs/2410.13954
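The two nonlinear-SGD entries above share a common template: an online SGD update in which a bounded nonlinearity (sign, component-wise clipping, quantization) is applied to the stochastic gradient to tame heavy-tailed noise. The sketch below illustrates only that template; the quadratic toy loss, the Student-t noise model, and the step-size schedule are assumptions made for the example, not the setting analyzed in the cited papers.

import numpy as np

def nonlinear_sgd(grad_oracle, x0, nonlinearity, steps=1000):
    """Online SGD where a bounded nonlinearity is applied to each stochastic gradient."""
    x = np.array(x0, dtype=float)
    for t in range(1, steps + 1):
        g = grad_oracle(x)            # noisy gradient (possibly heavy-tailed noise)
        eta = 1.0 / np.sqrt(t)        # illustrative diminishing step size
        x -= eta * nonlinearity(g)    # the update uses the transformed gradient
    return x

def clip_componentwise(g, tau=1.0):
    """Component-wise clipping; np.sign is another nonlinearity covered by the framework."""
    return np.clip(g, -tau, tau)

# Toy problem: minimize ||x||^2 / 2 under heavy-tailed (Student-t) gradient noise.
rng = np.random.default_rng(0)
oracle = lambda x: x + rng.standard_t(df=2.0, size=x.shape)
print(nonlinear_sgd(oracle, x0=np.ones(5), nonlinearity=np.sign))
print(nonlinear_sgd(oracle, x0=np.ones(5), nonlinearity=clip_componentwise))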
We develop a family of distributed clustering algorithms that work over networks of users. In the proposed scenario, users contain a local dataset and communicate only with their immediate neighbours, with the aim of finding a clustering of the full, …
External link:
http://arxiv.org/abs/2402.01302
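As a rough illustration of the decentralized setting described in the entry above (not the algorithm of the cited paper), the sketch below runs one k-means-style round in which each user computes cluster statistics from its local data and then pools them only with its immediate neighbours. The path-graph topology, the pooling rule, and all dimensions are assumptions made for the example.

import numpy as np

def local_stats(data, centroids):
    """Per-user k-means statistics: per-cluster coordinate sums and counts."""
    labels = np.argmin(((data[:, None, :] - centroids[None]) ** 2).sum(-1), axis=1)
    k, d = centroids.shape
    sums = np.zeros((k, d))
    counts = np.zeros(k)
    for point, c in zip(data, labels):
        sums[c] += point
        counts[c] += 1
    return sums, counts

def decentralized_kmeans_round(datasets, neighbours, centroids):
    """One round: local statistics, then pooling with immediate neighbours only."""
    stats = [local_stats(d, centroids) for d in datasets]
    new_centroids = []
    for i in range(len(datasets)):
        group = [i] + neighbours[i]
        sums = sum(stats[j][0] for j in group)
        counts = sum(stats[j][1] for j in group)
        new_centroids.append(sums / np.maximum(counts, 1)[:, None])
    return new_centroids  # one (generally different) centroid estimate per user

rng = np.random.default_rng(1)
datasets = [rng.normal(loc=m, size=(50, 2)) for m in (0.0, 5.0, 10.0)]  # 3 users
neighbours = {0: [1], 1: [0, 2], 2: [1]}                                # path graph
print(decentralized_kmeans_round(datasets, neighbours, rng.normal(size=(3, 2)))[0])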
In this paper, we present an advanced approach to solving the inverse rig problem in blendshape animation, using high-quality corrective blendshapes. Our algorithm introduces novel enhancements in three key areas: ensuring high data fidelity in recon…
External link:
http://arxiv.org/abs/2401.16496
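For readers unfamiliar with the inverse rig problem mentioned above: given a target mesh and a blendshape model, one seeks activation weights that reproduce the target. The sketch below solves only the basic linear variant as a bound-constrained least-squares fit; the corrective blendshape terms the paper deals with are omitted, SciPy is an assumed dependency, and the mesh and rig dimensions are illustrative.

import numpy as np
from scipy.optimize import lsq_linear

def invert_linear_rig(neutral, blendshape_matrix, target):
    """Fit weights w in [0, 1] so that neutral + B @ w approximates the target mesh."""
    result = lsq_linear(blendshape_matrix, target - neutral, bounds=(0.0, 1.0))
    return result.x

# Toy dimensions: a mesh flattened to 300 coordinates, controlled by 40 blendshapes.
rng = np.random.default_rng(2)
neutral = rng.normal(size=300)
B = rng.normal(size=(300, 40))
true_w = rng.uniform(0.0, 1.0, size=40)
target = neutral + B @ true_w
w_hat = invert_linear_rig(neutral, B, target)
print(np.max(np.abs(w_hat - true_w)))   # should be near zero in this noiseless toy case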
Motivated by localization problems such as cadastral maps refinements, we consider a generic Nonlinear Least Squares (NLS) problem of minimizing an aggregate squared fit across all nonlinear equations (measurements) with respect to the set of unknown …
External link:
http://arxiv.org/abs/2312.09064
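The generic NLS problem described in the entry above, fitting unknown positions to nonlinear measurement equations, can be illustrated with a small range-based localization example. The plain gradient descent below, the anchor layout, and the noise level are assumptions for illustration only and are not the solver studied in the paper.

import numpy as np

def nls_gradient_descent(anchors, ranges, x0, step=0.05, iters=500):
    """Minimize sum_i (||x - a_i|| - r_i)^2 over the unknown position x."""
    x = np.array(x0, dtype=float)
    for _ in range(iters):
        diffs = x - anchors                      # shape (num_anchors, 2)
        dists = np.linalg.norm(diffs, axis=1)    # ||x - a_i||
        residuals = dists - ranges
        grad = 2.0 * (residuals / np.maximum(dists, 1e-9))[:, None] * diffs
        x -= step * grad.sum(axis=0)
    return x

anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
true_pos = np.array([3.0, 7.0])
rng = np.random.default_rng(3)
ranges = np.linalg.norm(anchors - true_pos, axis=1) + rng.normal(scale=0.05, size=4)
print(nls_gradient_descent(anchors, ranges, x0=np.array([5.0, 5.0])))  # close to (3, 7)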
Authors:
Armacki, Aleksandar, Sharma, Pranay, Joshi, Gauri, Bajovic, Dragana, Jakovetic, Dusan, Kar, Soummya
We study high-probability convergence guarantees of learning on streaming data in the presence of heavy-tailed noise. In the proposed scenario, the model is updated in an online fashion, as new information is observed, without storing any additional …
External link:
http://arxiv.org/abs/2310.18784
Motivated by understanding and analysis of large-scale machine learning under heavy-tailed gradient noise, we study distributed optimization with gradient clipping, i.e., in which certain clipping operators are applied to the gradients or gradient es…
External link:
http://arxiv.org/abs/2310.16920
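A minimal sketch of the kind of clipped distributed scheme the entry above alludes to is given below: each agent mixes its iterate with its neighbours' and then takes a step along its clipped local gradient. The quadratic local objectives, the ring topology, the mixing matrix, and the clipping threshold are placeholder assumptions, not the paper's exact operators or analysis.

import numpy as np

def clip_by_norm(g, tau):
    """Scale a gradient down so its Euclidean norm does not exceed tau."""
    norm = np.linalg.norm(g)
    return g if norm <= tau else g * (tau / norm)

def distributed_clipped_gd(targets, mixing_matrix, steps=200, eta=0.1, tau=1.0):
    """Each agent i minimizes ||x - b_i||^2 / 2; every update mixes neighbour
    iterates and applies a clipped local gradient (consensus on the mean of b_i)."""
    n, d = targets.shape
    x = np.zeros((n, d))
    for _ in range(steps):
        grads = x - targets                       # local gradients of the quadratics
        clipped = np.array([clip_by_norm(g, tau) for g in grads])
        x = mixing_matrix @ x - eta * clipped     # mix with neighbours, then descend
    return x

# 4 agents on a ring with a doubly stochastic mixing matrix.
W = np.array([[0.5, 0.25, 0.0, 0.25],
              [0.25, 0.5, 0.25, 0.0],
              [0.0, 0.25, 0.5, 0.25],
              [0.25, 0.0, 0.25, 0.5]])
b = np.array([[0.0], [2.0], [4.0], [6.0]])
print(distributed_clipped_gd(b, W))   # each row should end up near the average, 3.0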
We consider two formulations for distributed optimization wherein $N$ agents in a generic connected network solve a problem of common interest: distributed personalized optimization and consensus optimization. A new method termed DINAS (Distributed I…
External link:
http://arxiv.org/abs/2305.13985
The problem of rig inversion is central in facial animation as it allows for a realistic and appealing performance of avatars. With the increasing complexity of modern blendshape models, execution times increase beyond practically feasible solutions.
External link:
http://arxiv.org/abs/2303.06370
We propose a new model-based algorithm solving the inverse rig problem in facial animation retargeting, exhibiting higher accuracy of the fit and sparser, more interpretable weight vector compared to SOTA. The proposed method targets a specific subdo…
External link:
http://arxiv.org/abs/2302.04843
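To illustrate how a sparsity-inducing term yields the "sparser, more interpretable weight vector" mentioned in the entry above, the sketch below adds an L1 penalty to a plain linear rig fit and solves it with ISTA-style proximal gradient steps, keeping the weights in [0, 1]. This is a generic sketch under those assumptions, not the paper's model-based method, and all dimensions and the penalty weight are illustrative.

import numpy as np

def sparse_rig_fit(B, target_offset, lam=0.1, iters=500):
    """Minimize 0.5 * ||B w - t||^2 + lam * ||w||_1 with w in [0, 1] via proximal gradient."""
    m, k = B.shape
    step = 1.0 / np.linalg.norm(B, ord=2) ** 2   # 1 / Lipschitz constant of the smooth part
    w = np.zeros(k)
    for _ in range(iters):
        grad = B.T @ (B @ w - target_offset)
        z = w - step * grad
        # Soft-thresholding (prox of the L1 term), then projection onto the box [0, 1].
        w = np.clip(np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0), 0.0, 1.0)
    return w

rng = np.random.default_rng(4)
B = rng.normal(size=(300, 40))
true_w = np.zeros(40)
true_w[[3, 7, 21]] = [0.8, 0.4, 0.6]             # only 3 blendshapes are truly active
target_offset = B @ true_w + rng.normal(scale=0.01, size=300)
w_hat = sparse_rig_fit(B, target_offset)
print(np.nonzero(w_hat > 1e-3)[0])               # should roughly recover the active indices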