Author:
Bishal Thapaliya, Riyasat Ohib, Eloy Geenjaar, Jingyu Liu, Vince Calhoun, Sergey M. Plis
Language:
English
Year of publication:
2024
Subject:

Source:
Frontiers in Neuroinformatics, Vol 18 (2024)
Document type:
article
ISSN:
1662-5196
DOI:
10.3389/fninf.2024.1430987
Description:
Recent advancements in neuroimaging have led to greater data sharing among the scientific community. However, institutions frequently maintain control over their data, citing concerns related to research culture, privacy, and accountability. This creates a demand for innovative tools capable of analyzing amalgamated datasets without the need to transfer actual data between entities. To address this challenge, we propose a decentralized sparse federated learning (FL) strategy. This approach emphasizes local training of sparse models to facilitate efficient communication within federated frameworks. By capitalizing on model sparsity and selectively sharing parameters between client sites during the training phase, our method significantly lowers communication overheads. This advantage becomes increasingly pronounced when dealing with larger models and accommodating the diverse resource capabilities of various sites. We demonstrate the effectiveness of our approach by applying it to the Adolescent Brain Cognitive Development (ABCD) dataset.
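The core idea in the abstract — clients train locally, sparsify their models, and share only the surviving parameters to cut communication — can be illustrated with a minimal sketch. This is an assumption-laden toy, not the paper's method: it assumes magnitude-based top-k sparsification and element-wise averaging over whichever clients shared each parameter; the actual masking and aggregation rules in the article may differ.

```python
import numpy as np

def sparsify(weights, keep_frac=0.1):
    """Keep only the largest-magnitude fraction of weights (top-k mask).

    Only the nonzero entries (and their indices) would need to be
    transmitted, which is where the communication savings come from.
    """
    k = max(1, int(keep_frac * weights.size))
    threshold = np.sort(np.abs(weights).ravel())[-k]
    mask = np.abs(weights) >= threshold
    return weights * mask, mask

def aggregate(sparse_updates, masks):
    """Average updates element-wise over the clients that shared each
    parameter; parameters no client shared stay zero."""
    total = np.sum(sparse_updates, axis=0)
    counts = np.sum(masks, axis=0)
    return np.divide(total, counts,
                     out=np.zeros_like(total), where=counts > 0)

# Toy round: three clients each contribute a sparsified local update.
rng = np.random.default_rng(0)
updates, masks = zip(*(sparsify(rng.normal(size=20), keep_frac=0.2)
                       for _ in range(3)))
global_update = aggregate(np.stack(updates), np.stack(masks))
```

Averaging only over the contributing clients (rather than dividing by the total number of clients) is one common choice for partial-sharing schemes; it avoids systematically shrinking parameters that few clients transmitted.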
Database:
Directory of Open Access Journals
External link:
