Showing 1 - 10 of 10 for search: '"Marfoq, Othmane"'
Author:
Mazziane, Younes Ben, Marfoq, Othmane
Count-Min Sketch with Conservative Updates (CMS-CU) is a memory-efficient hash-based data structure used to estimate the occurrences of items within a data stream. CMS-CU stores $m$ counters and employs $d$ hash functions to map items to these counters…
External link:
http://arxiv.org/abs/2405.12034
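Since the abstract only sketches the structure, here is a minimal Python illustration of a Count-Min Sketch with conservative updates, assuming a single shared array of $m$ counters addressed by $d$ seeded hash functions (some formulations use $d$ separate rows instead); the class and method names are illustrative, not taken from the paper.

import hashlib

class CountMinSketchCU:
    # Count-Min Sketch with Conservative Updates: m shared counters and
    # d hash functions; estimates never undercount the true frequency.
    def __init__(self, m, d):
        self.m, self.d = m, d
        self.counters = [0] * m

    def _indices(self, item):
        # d pseudo-independent indices derived from seeded hashes of the item.
        return [
            int.from_bytes(
                hashlib.blake2b(item.encode(), person=seed.to_bytes(16, "little")).digest()[:8],
                "little",
            ) % self.m
            for seed in range(self.d)
        ]

    def estimate(self, item):
        # The estimate is the minimum over the d counters the item maps to.
        return min(self.counters[i] for i in self._indices(item))

    def update(self, item, count=1):
        idxs = self._indices(item)
        target = min(self.counters[i] for i in idxs) + count
        # Conservative update: raise only the counters that lie below the new
        # estimate, which reduces overestimation compared to plain CMS.
        for i in idxs:
            if self.counters[i] < target:
                self.counters[i] = target

# Example: frequencies are never underestimated, and usually close to exact.
cms = CountMinSketchCU(m=1024, d=4)
cms.update("alice"); cms.update("alice"); cms.update("bob")
print(cms.estimate("alice"))  # >= 2, typically exactly 2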
Author:
Kaplan, Caelin G., Xu, Chuan, Marfoq, Othmane, Neglia, Giovanni, de Oliveira, Anderson Santana
Within the realm of privacy-preserving machine learning, empirical privacy defenses have been proposed as a solution to achieve satisfactory levels of training data privacy without a significant drop in model utility. Most existing defenses against…
External link:
http://arxiv.org/abs/2310.12112
Author:
Rodio, Angelo, Faticanti, Francescomaria, Marfoq, Othmane, Neglia, Giovanni, Leonardi, Emilio
The enormous amount of data produced by mobile and IoT devices has motivated the development of federated learning (FL), a framework allowing such devices (or clients) to collaboratively train machine learning models without sharing their local data.
External link:
http://arxiv.org/abs/2301.04632
Federated learning (FL) is an effective solution to train machine learning models on the increasing amount of data generated by IoT devices and smartphones while keeping such data localized. Most previous work on federated learning assumes that clients…
External link:
http://arxiv.org/abs/2301.01542
Author:
Terrail, Jean Ogier du, Ayed, Samy-Safwan, Cyffers, Edwige, Grimberg, Felix, He, Chaoyang, Loeb, Regis, Mangold, Paul, Marchand, Tanguy, Marfoq, Othmane, Mushtaq, Erum, Muzellec, Boris, Philippenko, Constantin, Silva, Santiago, Teleńczuk, Maria, Albarqouni, Shadi, Avestimehr, Salman, Bellet, Aurélien, Dieuleveut, Aymeric, Jaggi, Martin, Karimireddy, Sai Praneeth, Lorenzi, Marco, Neglia, Giovanni, Tommasi, Marc, Andreux, Mathieu
Federated Learning (FL) is a novel approach enabling several clients holding sensitive data to collaboratively train machine learning models, without centralizing data. The cross-silo FL setting corresponds to the case of few ($2$--$50$) reliable clients…
External link:
http://arxiv.org/abs/2210.04620
Federated learning allows clients to collaboratively learn statistical models while keeping their data local. Federated learning was originally used to train a unique global model to be served to all clients, but this approach might be sub-optimal when…
External link:
http://arxiv.org/abs/2111.09360
The increasing size of data generated by smartphones and IoT devices motivated the development of Federated Learning (FL), a framework for on-device collaborative training of machine learning models. First efforts in FL focused on learning a single global model…
External link:
http://arxiv.org/abs/2108.10252
Federated learning usually employs a client-server architecture where an orchestrator iteratively aggregates model updates from remote clients and pushes a refined model back to them. This approach may be inefficient in cross-silo settings, as close-by…
External link:
http://arxiv.org/abs/2010.12229
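As a companion to the client-server description in this abstract, here is a minimal sketch of one standard FedAvg-style orchestration round (the baseline the paper calls potentially inefficient, not the method it proposes); `local_update`, the synthetic data, and the uniform weighting are hypothetical placeholders.

import numpy as np

def local_update(model, data, lr=0.1):
    # Hypothetical client step: one gradient step of least-squares regression
    # on the client's local data (X, y); real clients would run several epochs.
    X, y = data
    grad = X.T @ (X @ model - y) / len(y)
    return model - lr * grad

def fedavg_round(global_model, client_datasets, weights=None):
    # One orchestrator round: broadcast the global model, collect each client's
    # locally updated model, and push back their (weighted) average as the
    # refined global model.
    updates = [local_update(global_model.copy(), data) for data in client_datasets]
    if weights is None:
        weights = [1.0 / len(updates)] * len(updates)
    return sum(w * u for w, u in zip(weights, updates))

# Example: two synthetic clients, a 3-dimensional linear model, ten rounds.
rng = np.random.default_rng(0)
clients = [(rng.normal(size=(20, 3)), rng.normal(size=20)) for _ in range(2)]
model = np.zeros(3)
for _ in range(10):
    model = fedavg_round(model, clients)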
Author:
Rodio, Angelo, Faticanti, Francescomaria, Marfoq, Othmane, Neglia, Giovanni, Leonardi, Emilio
Published in:
IEEE/ACM Transactions on Networking, 2024, Vol. 32, Issue 2, pp. 1451-1460
Published in:
ICML - 39th International Conference on Machine Learning, Jul 2022, Baltimore (Maryland), United States
Federated learning allows clients to collaboratively learn statistical models while keeping their data local. Federated learning was originally used to train a unique global model to be served to all clients, but this approach might be sub-optimal wh
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::9a4dc3105160d70f6ebb029b48ebf618
https://hal.science/hal-03697969