A federated learning scheme meets dynamic differential privacy

Author: Shengnan Guo, Xibin Wang, Shigong Long, Hai Liu, Liu Hai, Toong Hai Sam
Language: English
Year of publication: 2023
Subject:
Source: CAAI Transactions on Intelligence Technology, Vol 8, Iss 3, Pp 1087-1100 (2023)
Document type: article
ISSN: 2468-2322
DOI: 10.1049/cit2.12187
Description: Abstract Federated learning has become a widely used distributed learning approach in recent years. Although model training shifts from collecting raw data to gathering parameters, privacy violations may still occur when models are published and shared. A dynamic approach is proposed to add Gaussian noise more effectively and apply differential privacy to federated deep learning. Concretely, it abandons the traditional way of distributing the privacy budget ϵ equally and instead adjusts the privacy budget dynamically to suit gradient‐descent federated learning, where the parameters are derived by computation to avoid the impact of manually chosen hyperparameters on the algorithm. It also incorporates adaptive threshold clipping to control sensitivity. Finally, the moments accountant is used to track the ϵ consumed on privacy preservation, and learning stops only when the ϵtotal set by the clients is reached; this allows the privacy budget to be fully exploited for model training. Experimental results on real datasets show that the trained model performs almost as well as a non‐private model and significantly better than the differential privacy method provided by TensorFlow.
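The abstract's pipeline (clip gradients to bound sensitivity, average, add Gaussian noise calibrated to a per-round budget, stop when the total budget ϵtotal is exhausted) can be sketched as below. This is a minimal illustration, not the authors' exact method: the increasing per-round schedule `0.1 * round_idx`, the simple additive composition standing in for the moments accountant, and all function names are assumptions for illustration.

```python
import numpy as np

def clip_gradient(grad, threshold):
    """Scale grad so its L2 norm is at most threshold (bounds sensitivity)."""
    norm = np.linalg.norm(grad)
    return grad * (threshold / norm) if norm > threshold else grad

def noisy_aggregate(client_grads, clip_threshold, eps_round, delta, rng):
    """One federated round: clip each client gradient, average, add noise."""
    clipped = [clip_gradient(np.asarray(g, dtype=float), clip_threshold)
               for g in client_grads]
    avg = np.mean(clipped, axis=0)
    # Averaging n clipped gradients has L2 sensitivity clip_threshold / n.
    sensitivity = clip_threshold / len(clipped)
    # Classic Gaussian-mechanism calibration for (eps_round, delta)-DP.
    sigma = sensitivity * np.sqrt(2.0 * np.log(1.25 / delta)) / eps_round
    return avg + rng.normal(0.0, sigma, size=avg.shape)

def run_training(grad_stream, eps_total, delta, clip_threshold, rng):
    """Spend eps_total across rounds with a dynamic (here: increasing)
    per-round budget, so later rounds get less noise; stop once the
    budget is exhausted. Plain additive composition is used here in
    place of the paper's moments accountant (an illustrative shortcut)."""
    spent, round_idx, updates = 0.0, 0, []
    for client_grads in grad_stream:
        round_idx += 1
        eps_round = 0.1 * round_idx          # illustrative dynamic schedule
        if spent + eps_round > eps_total:
            break                            # budget exhausted: stop learning
        updates.append(noisy_aggregate(client_grads, clip_threshold,
                                       eps_round, delta, rng))
        spent += eps_round
    return updates, spent
```

The key contrast with a fixed-budget scheme is the `eps_round` line: rather than dividing ϵtotal evenly over a preset number of rounds, the per-round budget is computed as training proceeds and the loop simply halts when the accountant says the budget is spent.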
Database: Directory of Open Access Journals