Showing 1 - 10 of 1,371 for search: '"Loizou P"'
Author:
Oikonomou, Dimitris, Loizou, Nicolas
Stochastic gradient descent with momentum, also known as the Stochastic Heavy Ball method (SHB), is one of the most popular algorithms for solving large-scale stochastic optimization problems in various machine learning tasks. In practical scenarios, tuning…
External link:
http://arxiv.org/abs/2406.04142
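For context, the heavy-ball update behind SHB adds a momentum term to plain SGD: x_{t+1} = x_t - γ g_t + β (x_t - x_{t-1}). The sketch below is a toy illustration on a noisy quadratic; the step size, momentum value, and noise model are my own choices, not the paper's setting or analysis:

```python
import random

def shb_step(x, x_prev, grad, lr=0.1, beta=0.9):
    """One Stochastic Heavy Ball update:
    x_{t+1} = x_t - lr * grad + beta * (x_t - x_prev)."""
    return x - lr * grad + beta * (x - x_prev)

# Minimize f(x) = 0.5 * x^2 using noisy gradients g = x + noise.
random.seed(0)
x_prev, x = 1.0, 1.0
for _ in range(200):
    g = x + random.gauss(0.0, 0.01)  # stochastic gradient of 0.5 * x^2
    x, x_prev = shb_step(x, x_prev, g), x
```

With these parameters the iterates contract toward the minimizer at 0, settling into a small noise ball whose size depends on the gradient noise and the step size.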
The performance of Automatic Target Recognition (ATR) algorithms on side-scan sonar imagery has been shown to degrade rapidly when deployed in non-benign environments. Complex seafloors and acoustic artefacts constitute distractors in the form of strong textural…
External link:
http://arxiv.org/abs/2404.18663
Gradient Descent Ascent (GDA) methods for min-max optimization problems typically produce oscillatory behavior that can lead to instability, e.g., in bilinear settings. To address this problem, we introduce a dissipation term into the GDA updates to…
External link:
http://arxiv.org/abs/2403.09090
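The oscillatory behavior of plain GDA referred to in this abstract is easy to reproduce on the bilinear problem f(x, y) = xy. The sketch below shows standard simultaneous GDA only; the paper's dissipation term is not reproduced, and the step size and test problem are illustrative assumptions:

```python
def gda_step(x, y, grad_x, grad_y, lr=0.1):
    """One simultaneous Gradient Descent Ascent step for min_x max_y f(x, y)."""
    return x - lr * grad_x, y + lr * grad_y

# On f(x, y) = x * y we have grad_x f = y and grad_y f = x.
# Plain GDA spirals outward from the saddle point at the origin.
x, y = 1.0, 1.0
for _ in range(50):
    x, y = gda_step(x, y, grad_x=y, grad_y=x)
radius = (x * x + y * y) ** 0.5  # grows each step: diverges from (0, 0)
```

Each step multiplies the distance to the saddle by sqrt(1 + lr^2), so the iterates drift away rather than converge, which is exactly the instability a damping/dissipation mechanism is meant to cure.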
The Stochastic Extragradient (SEG) method is one of the most popular algorithms for solving finite-sum min-max optimization and variational inequality problems (VIPs) appearing in various machine learning tasks. However, existing convergence analyses…
External link:
http://arxiv.org/abs/2403.07148
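A generic (here deterministic, for clarity) extragradient step of the kind SEG randomizes can be sketched as follows: probe the operator at an extrapolated point, then update from the original point using the probed value. The operator, step sizes, and test problem are illustrative assumptions, not the paper's setting:

```python
def eg_step(x, y, F, lr_explore=0.1, lr_update=0.1):
    """One extragradient step for the operator F(x, y):
    extrapolate, re-evaluate F there, then update the original point."""
    gx, gy = F(x, y)
    x_half, y_half = x - lr_explore * gx, y - lr_explore * gy
    gx2, gy2 = F(x_half, y_half)
    return x - lr_update * gx2, y - lr_update * gy2

# For the bilinear saddle f(x, y) = x * y, the associated operator is
# F(x, y) = (y, -x).  Extragradient contracts where plain GDA spirals out.
F = lambda x, y: (y, -x)
x, y = 1.0, 1.0
for _ in range(100):
    x, y = eg_step(x, y, F)
```

On this bilinear example the extra lookahead evaluation turns divergent oscillation into geometric convergence toward the saddle at the origin.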
Author:
Choudhury, Sayantan, Tupitsa, Nazarii, Loizou, Nicolas, Horvath, Samuel, Takac, Martin, Gorbunov, Eduard
Adaptive methods are extremely popular in machine learning as they make learning rate tuning less expensive. This paper introduces a novel optimization algorithm named KATE, which presents a scale-invariant adaptation of the well-known AdaGrad algorithm…
External link:
http://arxiv.org/abs/2403.02648
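For reference, the classical AdaGrad update that KATE adapts divides each step by the square root of the running sum of squared gradients, so the effective learning rate shrinks automatically. The scalar sketch below shows plain AdaGrad only; KATE's scale-invariant modification is not reproduced, and the objective and base rate are my own choices:

```python
import math

def adagrad_step(x, grad, accum, lr=1.0, eps=1e-8):
    """One scalar AdaGrad update: accumulate squared gradients and
    scale the step by 1 / sqrt(accumulated sum)."""
    accum = accum + grad * grad
    x = x - lr * grad / (math.sqrt(accum) + eps)
    return x, accum

# Minimize f(x) = 0.5 * x^2, whose gradient is f'(x) = x.
x, accum = 5.0, 0.0
for _ in range(500):
    x, accum = adagrad_step(x, x, accum)
```

Note that multiplying the objective (and hence all gradients) by a constant changes AdaGrad's trajectory; removing that sensitivity is the kind of scale-invariance the abstract refers to.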
We introduce FacadeNet, a deep learning approach for synthesizing building facade images from diverse viewpoints. Our method employs a conditional GAN, taking a single view of a facade along with the desired viewpoint information and generating an image…
External link:
http://arxiv.org/abs/2311.01240
Federated learning is a paradigm of distributed machine learning in which multiple clients coordinate with a central server to learn a model, without sharing their own training data. Standard federated optimization methods such as Federated Averaging…
External link:
http://arxiv.org/abs/2307.06306
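The Federated Averaging scheme named in this abstract alternates local client training with server-side model averaging. The toy sketch below uses uniform averaging and a synthetic per-client quadratic objective as simplifying assumptions (FedAvg proper weights clients by their sample counts):

```python
def local_sgd(w, data, lr=0.1, epochs=5):
    """A few local gradient steps on one client's objective
    f_i(w) = 0.5 * mean((w - d)^2), whose gradient is w - mean(data)."""
    target = sum(data) / len(data)
    for _ in range(epochs):
        w = w - lr * (w - target)
    return w

def fedavg_round(w_global, clients, lr=0.1, epochs=5):
    """One FedAvg round: broadcast the global model, train locally on
    each client, then average the returned local models."""
    local_models = [local_sgd(w_global, d, lr, epochs) for d in clients]
    return sum(local_models) / len(local_models)

clients = [[0.0, 2.0], [4.0], [6.0, 8.0]]  # toy per-client datasets
w = 0.0
for _ in range(30):
    w = fedavg_round(w, clients)
# w approaches the average of the client optima (1, 4, 7), i.e. 4.0
```

Because each client's local optimum differs (client drift), the averaged model need not coincide with the global optimum in general; on this symmetric toy example it does.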
Distributed and federated learning algorithms and techniques are associated primarily with minimization problems. However, with the increase of minimax optimization and variational inequality problems in machine learning, the necessity of designing efficient…
External link:
http://arxiv.org/abs/2306.05100
Author:
Zia, Aneeq, Bhattacharyya, Kiran, Liu, Xi, Berniker, Max, Wang, Ziheng, Nespolo, Rogerio, Kondo, Satoshi, Kasai, Satoshi, Hirasawa, Kousuke, Liu, Bo, Austin, David, Wang, Yiheng, Futrega, Michal, Puget, Jean-Francois, Li, Zhenqiang, Sato, Yoichi, Fujii, Ryo, Hachiuma, Ryo, Masuda, Mana, Saito, Hideo, Wang, An, Xu, Mengya, Islam, Mobarakol, Bai, Long, Pang, Winnie, Ren, Hongliang, Nwoye, Chinedu, Sestini, Luca, Padoy, Nicolas, Nielsen, Maximilian, Schüttler, Samuel, Sentker, Thilo, Husseini, Hümeyra, Baltruschat, Ivo, Schmitz, Rüdiger, Werner, René, Matsun, Aleksandr, Farooq, Mugariya, Saaed, Numan, Viera, Jose Renato Restom, Yaqub, Mohammad, Getty, Neil, Xia, Fangfang, Zhao, Zixuan, Duan, Xiaotian, Yao, Xing, Lou, Ange, Yang, Hao, Han, Jintong, Noble, Jack, Wu, Jie Ying, Alshirbaji, Tamer Abdulbaki, Jalal, Nour Aldeen, Arabian, Herag, Ding, Ning, Moeller, Knut, Chen, Weiliang, He, Quan, Bilal, Muhammad, Akinosho, Taofeek, Qayyum, Adnan, Caputo, Massimo, Vohra, Hunaid, Loizou, Michael, Ajayi, Anuoluwapo, Berrou, Ilhem, Niyi-Odumosu, Faatihah, Maier-Hein, Lena, Stoyanov, Danail, Speidel, Stefanie, Jarc, Anthony
The ability to automatically detect and track surgical instruments in endoscopic videos can enable transformational interventions. Assessing surgical performance and efficiency, identifying skilled tool use and choreography, and planning operational…
External link:
http://arxiv.org/abs/2305.07152
Single-call stochastic extragradient methods, like stochastic past extragradient (SPEG) and stochastic optimistic gradient (SOG), have gained a lot of interest in recent years and are one of the most efficient algorithms for solving large-scale min-max…
External link:
http://arxiv.org/abs/2302.14043
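A "single-call" method such as the optimistic gradient reuses the previous operator evaluation instead of making a second fresh call per iteration, halving the per-step cost relative to extragradient. The sketch below shows a generic optimistic-gradient step on a bilinear toy problem; the step size and test problem are my own illustrative choices, not the paper's:

```python
def og_step(z, F_now, F_prev, lr=0.1):
    """One optimistic gradient (single-call) step:
    z_{t+1} = z_t - lr * (2 * F(z_t) - F(z_{t-1})).
    Only one fresh operator evaluation is needed per iteration."""
    return tuple(zi - lr * (2.0 * gn - gp)
                 for zi, gn, gp in zip(z, F_now, F_prev))

# Bilinear saddle f(x, y) = x * y  ->  operator F(x, y) = (y, -x).
F = lambda z: (z[1], -z[0])
z = (1.0, 1.0)
g_prev = F(z)  # bootstrap: first step reduces to a plain gradient step
for _ in range(200):
    g_now = F(z)           # the single operator call of this iteration
    z = og_step(z, g_now, g_prev)
    g_prev = g_now         # remember it for the next iteration
```

Like extragradient, the extrapolation 2F(z_t) - F(z_{t-1}) stabilizes the bilinear dynamics, so the iterates converge toward the saddle at the origin where plain GDA would spiral away.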