Showing 1 - 10
of 153
for the search: '"AMBITE, JOSÉ LUIS"'
In the realm of medical imaging, leveraging large-scale datasets from various institutions is crucial for developing precise deep learning models, yet privacy concerns frequently impede data sharing. Federated learning (FL) emerges as a prominent solution…
External link:
http://arxiv.org/abs/2406.17235
Author:
Stripelis, Dimitris, Anastasiou, Chrysovalantis, Toral, Patrick, Asghar, Armaghan, Ambite, Jose Luis
A Federated Learning (FL) system typically consists of two core processing entities: the federation controller and the learners. The controller is responsible for managing the execution of FL workflows across learners and the learners for training an…
External link:
http://arxiv.org/abs/2311.00334
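The controller/learner split described in the abstract above can be sketched as a minimal FedAvg-style round; this is an illustrative toy (least-squares learners, hypothetical function names), not the paper's actual API:

```python
import numpy as np

def learner_update(global_weights, local_data, lr=0.1):
    # Each learner takes one gradient step on its own silo's data
    # (a toy least-squares task stands in for local model training).
    X, y = local_data
    grad = X.T @ (X @ global_weights - y) / len(y)
    return global_weights - lr * grad

def controller_round(global_weights, learners):
    # The controller ships the current model to every learner, collects the
    # locally updated weights, and aggregates them weighted by each learner's
    # number of local examples (FedAvg-style averaging).
    updates, sizes = [], []
    for data in learners:
        updates.append(learner_update(global_weights, data))
        sizes.append(len(data[1]))
    return np.average(np.stack(updates), axis=0, weights=np.asarray(sizes, dtype=float))
```

Running `controller_round` repeatedly drives the shared model toward the joint solution without any learner ever exposing its raw data to the controller.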
Applying Multivariate Segmentation Methods to Human Activity Recognition From Wearable Sensors’ Data
Author:
Li, Kenan, Habre, Rima, Deng, Huiyu, Urman, Robert, Morrison, John, Gilliland, Frank D, Ambite, José Luis, Stripelis, Dimitris, Chiang, Yao-Yi, Lin, Yijun, Bui, Alex AT, King, Christine, Hosseini, Anahita, Vliet, Eleanne Van, Sarrafzadeh, Majid, Eckel, Sandrah P
Published in:
JMIR mHealth and uHealth, Vol 7, Iss 2, p e11201 (2019)
Background: Time-resolved quantification of physical activity can contribute to both personalized medicine and epidemiological research studies, for example, managing and identifying triggers of asthma exacerbations. A growing number of reportedly accurate…
External link:
https://doaj.org/article/a47035c922a94672a72ed6632c7de107
Author:
Stripelis, Dimitris, Ambite, Jose Luis
Federated Learning is a distributed machine learning approach that enables geographically distributed data silos to collaboratively learn a joint machine learning model without sharing data. Most of the existing work operates on unstructured data, such as…
External link:
http://arxiv.org/abs/2305.08985
Author:
Stripelis, Dimitris, Gupta, Umang, Dhinagar, Nikhil, Steeg, Greg Ver, Thompson, Paul, Ambite, José Luis
Federated training of large deep neural networks can often be restrictive due to the increasing costs of communicating the updates with increasing model sizes. Various model pruning techniques have been designed in centralized settings to reduce inference…
External link:
http://arxiv.org/abs/2208.11669
Author:
Stripelis, Dimitris, Gupta, Umang, Saleem, Hamza, Dhinagar, Nikhil, Ghai, Tanmay, Anastasiou, Rafael Chrysovalantis, Asghar, Armaghan, Steeg, Greg Ver, Ravi, Srivatsan, Naveed, Muhammad, Thompson, Paul M., Ambite, Jose Luis
The amount of biomedical data continues to grow rapidly. However, collecting data from multiple sites for joint analysis remains challenging due to security, privacy, and regulatory concerns. To overcome this challenge, we use Federated Learning, which…
External link:
http://arxiv.org/abs/2205.05249
Federated Learning has emerged as a dominant computational paradigm for distributed machine learning. Its unique data privacy properties allow us to collaboratively train models while offering participating clients certain privacy-preserving guarantees…
External link:
http://arxiv.org/abs/2205.01184
To improve federated training of neural networks, we develop FedSparsify, a sparsification strategy based on progressive weight magnitude pruning. Our method has several benefits. First, since the size of the network becomes increasingly smaller, communication…
External link:
http://arxiv.org/abs/2204.12430
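The core mechanism the FedSparsify abstract describes, progressive weight magnitude pruning, can be sketched as follows; the polynomial schedule and the function names are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def magnitude_prune(weights: np.ndarray, sparsity: float) -> np.ndarray:
    # Zero out the given fraction of weights with the smallest magnitudes;
    # the surviving large-magnitude weights are kept unchanged.
    k = int(weights.size * sparsity)
    if k == 0:
        return weights.copy()
    threshold = np.partition(np.abs(weights).ravel(), k - 1)[k - 1]
    return weights * (np.abs(weights) > threshold)

def progressive_sparsity(round_t: int, total_rounds: int, final_sparsity: float) -> float:
    # Hypothetical polynomial schedule: sparsity starts at 0 and ramps up
    # toward final_sparsity as federated training rounds progress.
    frac = min(round_t / total_rounds, 1.0)
    return final_sparsity * (1.0 - (1.0 - frac) ** 3)
```

Because the pruned mask grows each round, the model shipped between controller and learners shrinks over time, which is the communication saving the abstract points to.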
We present an analysis of the performance of Federated Learning in a paradigmatic natural-language processing task: Named-Entity Recognition (NER). For our evaluation, we use the language-independent CoNLL-2003 dataset as our benchmark dataset and a…
External link:
http://arxiv.org/abs/2203.15101
Author:
Stripelis, Dimitris, Gupta, Umang, Saleem, Hamza, Dhinagar, Nikhil, Ghai, Tanmay, Anastasiou, Chrysovalantis, Sánchez, Rafael, Steeg, Greg Ver, Ravi, Srivatsan, Naveed, Muhammad, Thompson, Paul M., Ambite, José Luis
Published in:
In Patterns, 9 August 2024, 5(8)