Showing 1 - 10 of 1,363 results for search: '"SHARMA, RISHI"'
Author:
Allouah, Youssef, Dhasade, Akash, Guerraoui, Rachid, Gupta, Nirupam, Kermarrec, Anne-Marie, Pinot, Rafael, Pires, Rafael, Sharma, Rishi
Federated learning (FL) is an appealing approach to training machine learning models without sharing raw data. However, standard FL algorithms are iterative and thus induce a significant communication cost. One-shot federated learning (OFL) trades th… (see the sketch below)
External link:
http://arxiv.org/abs/2411.07182
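As an illustration of the one-round pattern that "one-shot" federated learning refers to (not the algorithm proposed in this paper), here is a minimal, hypothetical sketch: every client trains a small linear model on its own data, and a server averages the resulting models in a single communication step. All names, the linear model, and the plain-averaging rule are assumptions made for the example.

import numpy as np

def local_train(X, y, epochs=50, lr=0.1):
    # Plain gradient descent on a least-squares objective (stand-in for local training).
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        w -= lr * X.T @ (X @ w - y) / len(y)
    return w

def one_shot_fl(client_data):
    # Single communication round: each client uploads one locally trained model,
    # the server averages them, and no further rounds take place.
    return np.mean([local_train(X, y) for X, y in client_data], axis=0)

# Illustrative usage with synthetic client datasets.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(5):
    X = rng.normal(size=(100, 2))
    clients.append((X, X @ true_w + 0.1 * rng.normal(size=100)))
print(one_shot_fl(clients))  # close to [2, -1]

Standard iterative FL would instead repeat the train-and-aggregate loop for many rounds, which is exactly the communication cost the abstract says OFL trades away.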
Author:
Biswas, Sayan, Kermarrec, Anne-Marie, Marouani, Alexis, Pires, Rafael, Sharma, Rishi, De Vos, Martijn
Decentralized learning (DL) is an emerging technique that allows nodes on the web to collaboratively train machine learning models without sharing raw data. Dealing with stragglers, i.e., nodes with slower compute or communication than others, is a k…
External link:
http://arxiv.org/abs/2410.12918
Decentralized learning (DL) is an emerging approach that enables nodes to collaboratively train a machine learning model without sharing raw data. In many application domains, such as healthcare, this approach faces challenges due to the high level o…
External link:
http://arxiv.org/abs/2410.02541
Published in:
Phys. Rev. D 110, 096011 (2024)
We study the effect of a varying pion mass on the quantum chromodynamics (QCD) phase diagram in the presence of an external magnetic field, aiming to understand it, for the first time, using Nambu–Jona-Lasinio like effective models. We com…
External link:
http://arxiv.org/abs/2407.14449
Author:
Dhasade, Akash, Dini, Paolo, Guerra, Elia, Kermarrec, Anne-Marie, Miozzo, Marco, Pires, Rafael, Sharma, Rishi, de Vos, Martijn
Decentralized learning (DL) offers a powerful framework where nodes collaboratively train models without sharing raw data and without the coordination of a central server. In the iterative rounds of DL, models are trained locally, shared with neighbo… (see the sketch below)
External link:
http://arxiv.org/abs/2407.01283
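The round structure the abstract describes (local training, sharing with neighbours, aggregation) can be sketched generically. The following is a hypothetical, minimal version that uses plain neighbourhood averaging over a fixed graph; it is not the scheme studied in the paper, and the toy quadratic objective and function names are assumptions.

import numpy as np

def dl_round(models, neighbors, grad_fn, lr=0.1):
    # One decentralized-learning round: each node takes a local gradient step
    # on its own data, then averages its model with its neighbours' models.
    trained = {n: w - lr * grad_fn(n, w) for n, w in models.items()}
    return {
        n: np.mean([trained[n]] + [trained[m] for m in neighbors[n]], axis=0)
        for n in trained
    }

# Toy usage: nodes on a small graph minimise ||w - target_n||^2;
# repeated averaging pulls everyone towards the mean of the local optima.
targets = {0: np.array([1.0]), 1: np.array([3.0]), 2: np.array([5.0])}
graph = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
models = {n: np.zeros(1) for n in graph}
for _ in range(50):
    models = dl_round(models, graph, lambda n, w: w - targets[n])
print(models)  # all close to the global mean, 3.0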
Decentralized learning (DL) faces increased vulnerability to privacy breaches due to sophisticated attacks on machine learning (ML) models. Secure aggregation is a computationally efficient cryptographic technique that enables multiple parties to com… (see the sketch below)
External link:
http://arxiv.org/abs/2405.07708
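Secure aggregation can be illustrated with a classic pairwise-masking construction: each pair of nodes derives a shared mask that one node adds and the other subtracts, so the masks cancel in the sum while individual inputs stay hidden. The sketch below is a toy version of that general idea (integer arithmetic, pairwise seeds handed out up front), not the protocol used in the paper.

import numpy as np

MOD = 2**32  # work in a finite group so masks wrap around cleanly

def masked_update(node_id, update, peers, pairwise_seeds):
    # Node i adds +mask(i, j) and node j adds -mask(i, j) for every pair (i, j),
    # so the aggregate of all masked updates equals the aggregate of the true ones.
    masked = update.copy()
    for peer in peers:
        seed = pairwise_seeds[frozenset((node_id, peer))]
        mask = np.random.default_rng(seed).integers(0, MOD, size=update.shape, dtype=np.int64)
        masked = (masked + mask) % MOD if node_id < peer else (masked - mask) % MOD
    return masked

# Toy usage with 3 nodes holding small integer-encoded updates.
updates = {0: np.array([5, 7]), 1: np.array([1, 2]), 2: np.array([10, 0])}
nodes = list(updates)
seeds = {frozenset((i, j)): hash((i, j)) % (2**31) for i in nodes for j in nodes if i < j}
masked = [masked_update(n, updates[n].astype(np.int64), [m for m in nodes if m != n], seeds)
          for n in nodes]
print(sum(masked) % MOD)  # [16 9]: the plain sum, recovered without seeing any single update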
Author:
Biswas, Sayan, Even, Mathieu, Kermarrec, Anne-Marie, Massoulie, Laurent, Pires, Rafael, Sharma, Rishi, de Vos, Martijn
Decentralized learning (DL) enables collaborative learning without a server and without training data leaving the users' devices. However, the models shared in DL can still be used to infer training data. Conventional defenses such as differential pr…
External link:
http://arxiv.org/abs/2404.09536
Author:
Biswas, Sayan, Frey, Davide, Gaudel, Romaric, Kermarrec, Anne-Marie, Lerévérend, Dimitri, Pires, Rafael, Sharma, Rishi, Taïani, François
This paper introduces ZIP-DL, a novel privacy-aware decentralized learning (DL) algorithm that exploits correlated noise to provide strong privacy protection against a local adversary while yielding efficient convergence guarantees for a low communic… (see the sketch below)
External link:
http://arxiv.org/abs/2403.11795
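The core idea of correlated noise, as opposed to independent differential-privacy noise, is that per-node perturbations are designed to cancel out in aggregate: an adversary observing one node's shared model sees noise, but network-wide averaging is unaffected. The sketch below only demonstrates that cancellation property; the centralized zero-sum draw is purely illustrative and is not how ZIP-DL generates its noise.

import numpy as np

def zero_sum_noise(n_nodes, dim, scale, rng):
    # Draw per-node noise vectors that look Gaussian individually but sum to
    # exactly zero across nodes, so averaging over the network cancels them.
    noise = rng.normal(scale=scale, size=(n_nodes, dim))
    return noise - noise.mean(axis=0)  # enforce the zero-sum (correlation) constraint

# Toy usage: each node perturbs the model it shares with correlated noise.
rng = np.random.default_rng(42)
models = rng.normal(size=(4, 3))             # 4 nodes, 3-dimensional models
noise = zero_sum_noise(4, 3, scale=1.0, rng=rng)
shared = models + noise                      # what a local adversary observes
print(np.allclose(shared.mean(axis=0), models.mean(axis=0)))  # True: the average is unaffected

A decentralized protocol would generate such noise with pairwise agreements rather than a central draw; the point here is only why correlated noise can protect individual models without hurting the aggregate.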
Author:
de Vos, Martijn, Farhadkhani, Sadegh, Guerraoui, Rachid, Kermarrec, Anne-Marie, Pires, Rafael, Sharma, Rishi
We present Epidemic Learning (EL), a simple yet powerful decentralized learning (DL) algorithm that leverages changing communication topologies to achieve faster model convergence compared to conventional DL approaches. At each round of EL, each node… (see the sketch below)
External link:
http://arxiv.org/abs/2310.01972
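The changing-topology idea can be sketched as follows: in every round each node takes a local step and then pushes its model to a fresh random sample of peers, so the communication graph differs from round to round instead of staying fixed. This is a hypothetical toy version, not the authors' implementation; the sample size, averaging rule, and toy objective are assumptions.

import numpy as np

def el_round(models, sample_size, grad_fn, rng, lr=0.1):
    # One Epidemic-Learning-style round: every node takes a local gradient step,
    # sends its model to a freshly sampled set of peers, and then averages its
    # own model with whatever it received this round.
    n = len(models)
    trained = [w - lr * grad_fn(i, w) for i, w in enumerate(models)]
    inbox = [[] for _ in range(n)]
    for i in range(n):
        peers = rng.choice([j for j in range(n) if j != i], size=sample_size, replace=False)
        for j in peers:
            inbox[j].append(trained[i])
    return [np.mean([trained[i]] + inbox[i], axis=0) for i in range(n)]

# Toy usage: nodes minimise ||w - target_i||^2; the random topologies mix models quickly.
targets = np.array([[0.0], [2.0], [4.0], [6.0]])
rng = np.random.default_rng(1)
models = [np.zeros(1) for _ in targets]
for _ in range(100):
    models = el_round(models, sample_size=2, grad_fn=lambda i, w: w - targets[i], rng=rng)
print(np.round(np.concatenate(models), 2))  # all nodes end up near the mean target, 3.0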