Data Quality-Aware Client Selection in Heterogeneous Federated Learning
Authors: Shinan Song, Yaxin Li, Jin Wan, Xianghua Fu, Jingyan Jiang
Language: English
Year of publication: 2024
Source: Mathematics, Vol 12, Iss 20, p 3229 (2024)
Document type: article
ISSN: 2227-7390
DOI: 10.3390/math12203229
Description: Federated Learning (FL) enables decentralized data utilization while maintaining edge user privacy, but it faces challenges due to statistical heterogeneity. Existing approaches address client drift and data heterogeneity, yet real-world settings often involve low-quality data with noisy features, such as covariate drift or adversarial samples, which are usually ignored. Noisy samples significantly degrade the global model's accuracy and convergence rate. Assessing data quality and selectively aggregating updates from high-quality clients is therefore crucial, but perceiving data quality dynamically without additional computation or data exchange is challenging. In this paper, we introduce the FedDQA (Federated Learning via Data Quality-Aware) framework. We find that increased data noise leads to slower loss reduction during local model training. Building on this observation, we propose a loss sharpness-based Data-Quality-Awareness (DQA) metric to distinguish high-quality from low-quality data. Based on the DQA metric, we design a client selection algorithm that strategically selects participating clients to reduce the negative impact of noisy clients. Experimental results indicate that FedDQA significantly outperforms the baselines, achieving up to a 4% increase in global model accuracy and faster convergence.
Database: Directory of Open Access Journals
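The abstract describes a selection rule driven by how quickly each client's local training loss falls: slower loss reduction signals noisier local data. The following is a minimal, hypothetical sketch of that idea in Python, assuming a simple proxy for the paper's DQA metric (average per-epoch loss drop over a few local epochs). The names `dqa_score` and `select_clients`, and the top-k selection rule, are illustrative assumptions, not the authors' implementation.

```python
# Hedged sketch: rank clients by how fast their local loss falls and keep the
# top k. The metric here is only a stand-in for the paper's loss sharpness-based
# DQA score; function names and the top-k rule are assumptions for illustration.
from typing import Dict, List


def dqa_score(loss_history: List[float]) -> float:
    """Approximate data-quality score: average per-epoch loss drop.

    A larger average drop suggests cleaner local data; a flatter loss
    curve (slower reduction) suggests noisier data.
    """
    if len(loss_history) < 2:
        return 0.0
    drops = [loss_history[i] - loss_history[i + 1]
             for i in range(len(loss_history) - 1)]
    return sum(drops) / len(drops)


def select_clients(client_losses: Dict[str, List[float]], k: int) -> List[str]:
    """Select the k clients whose local loss curves fall the fastest."""
    ranked = sorted(client_losses,
                    key=lambda c: dqa_score(client_losses[c]),
                    reverse=True)
    return ranked[:k]


if __name__ == "__main__":
    # Toy example: client "b" has a flat loss curve, suggesting noisy data.
    losses = {
        "a": [2.3, 1.6, 1.1, 0.8],
        "b": [2.3, 2.2, 2.1, 2.0],
        "c": [2.3, 1.8, 1.4, 1.1],
    }
    print(select_clients(losses, k=2))  # -> ['a', 'c']
```

In a federated round, a server could collect each participating client's short local loss history alongside its model update and apply a rule like the one above before aggregation; the actual FedDQA procedure is specified in the paper itself.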