On Joint Noise Scaling in Differentially Private Federated Learning with Multiple Local Steps

Author: Heikkilä, Mikko A.
Publication year: 2024
Subject:
Document type: Working Paper
Description: Federated learning is a distributed learning setting whose main aim is to train machine learning models without sharing raw data, exchanging only what is required for learning. To guarantee training-data privacy and high-utility models, differential privacy and secure aggregation techniques are often combined with federated learning. However, with fine-grained protection granularities, currently existing techniques require the parties to communicate after every local optimisation step if they want to fully benefit from secure aggregation in terms of the resulting formal privacy guarantees. In this paper, we show how a simple new analysis allows the parties to perform multiple local optimisation steps while still benefiting from joint noise scaling under secure aggregation. We show that our analysis enables higher-utility models with guaranteed privacy protection under a limited number of communication rounds.
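A minimal sketch of the joint noise scaling idea the abstract refers to, under stated assumptions (this is not the paper's actual analysis or protocol; the toy quadratic loss, all names, and all parameter values are illustrative): each of N parties perturbs its local steps with Gaussian noise of standard deviation scaled by 1/sqrt(N), so the securely aggregated sum carries the full central-DP noise scale, even with multiple local steps between communication rounds.

```python
# Illustrative sketch only: N clients each take K local noisy gradient steps;
# per-client noise std is sigma*clip/sqrt(N), so the summed noise over N
# clients has std sigma*clip, matching the central-DP scale. Names and the
# toy loss are assumptions, not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)
N, K, d = 10, 5, 3            # clients, local steps per round, model dimension
sigma, lr, clip = 1.0, 0.1, 1.0

def clipped_grad(theta, data):
    g = theta - data          # gradient of the toy loss 0.5*||theta - data||^2
    norm = np.linalg.norm(g)
    return g * min(1.0, clip / (norm + 1e-12))  # standard DP-style clipping

theta = np.zeros(d)
client_data = rng.normal(size=(N, d))

for _ in range(3):            # communication rounds
    updates = []
    for i in range(N):
        local = theta.copy()
        for _ in range(K):    # multiple local steps between communications
            noise = rng.normal(0.0, sigma * clip / np.sqrt(N), size=d)
            local -= lr * (clipped_grad(local, client_data[i]) + noise)
        updates.append(local - theta)
    # Secure aggregation is modelled as a plain sum; in practice only the sum
    # is revealed, so the N noise shares combine to std sigma*clip per step.
    theta += np.sum(updates, axis=0) / N
```

Because a sum of N independent Gaussians, each with std sigma*clip/sqrt(N), has std sigma*clip, no single party's update is protected by the full noise on its own; the privacy guarantee relies on the aggregator seeing only the sum.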
Comment: 14 pages with appendix, 3 figures, 1 table
Database: arXiv