Aggregation Service for Federated Learning: An Efficient, Secure, and More Resilient Realization

Authors: Yifeng Zheng, Shangqi Lai, Yi Liu, Xingliang Yuan, Xun Yi, Cong Wang
Year of publication: 2022
DOI: 10.48550/arxiv.2202.01971
Description: Federated learning has recently emerged as a paradigm promising the benefits of harnessing rich data from diverse sources to train high-quality models, with the salient feature that training datasets never leave local devices; only model updates are locally computed and shared for aggregation to produce a global model. While federated learning greatly alleviates the privacy concerns of learning with centralized data, sharing model updates still poses privacy risks. In this paper, we present a system design that offers efficient protection of individual model updates throughout the learning procedure, allowing clients to provide only obscured model updates while a cloud server can still perform the aggregation. Our federated learning system departs from prior work by supporting lightweight encryption and aggregation, as well as resilience against dropout clients with no impact on their participation in future rounds. Meanwhile, prior work largely overlooks bandwidth-efficiency optimization in the ciphertext domain and security against an actively adversarial cloud server, both of which we fully explore in this paper, providing effective and efficient mechanisms. Extensive experiments over several benchmark datasets (MNIST, CIFAR-10, and CelebA) show that our system achieves accuracy comparable to the plaintext baseline, with practical performance.
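Note: this record does not include the paper's concrete protocol, so the Python sketch below is only a generic illustration of the kind of "obscured update" aggregation the description refers to: pairwise additive masking, where per-client masks cancel in the server-side sum so the server recovers the aggregate without seeing any individual update. All names, dimensions, and the masking scheme itself are hypothetical and not drawn from the paper (dropout handling and the active-adversary defenses it describes are omitted).

    # Illustrative sketch only, NOT the authors' construction:
    # pairwise-masking secure aggregation where masks cancel in the sum.
    import numpy as np

    DIM = 4        # toy model-update dimension (hypothetical)
    CLIENTS = 3    # toy number of clients (hypothetical)

    rng = np.random.default_rng(0)
    updates = [rng.normal(size=DIM) for _ in range(CLIENTS)]

    # Pairwise seeds s_ij agreed between clients i < j (e.g., via key agreement).
    seeds = {(i, j): int(rng.integers(2**32))
             for i in range(CLIENTS) for j in range(i + 1, CLIENTS)}

    def mask(i):
        # Client i's mask: +PRG(s_ij) for each j > i, -PRG(s_ij) for each j < i,
        # so every pair's contributions cancel when all masked updates are summed.
        m = np.zeros(DIM)
        for j in range(CLIENTS):
            if i == j:
                continue
            a, b = min(i, j), max(i, j)
            prg = np.random.default_rng(seeds[(a, b)]).normal(size=DIM)
            m += prg if i < j else -prg
        return m

    obscured = [u + mask(i) for i, u in enumerate(updates)]  # what clients send
    aggregate = sum(obscured)                                # what the server computes

    # The server learns only the sum of the true updates, not any individual one.
    assert np.allclose(aggregate, sum(updates))

The final assertion checks the cancellation property: summing the obscured updates yields exactly the sum of the plaintext updates, which is all the server needs to produce the global model.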
Comment: Accepted in IEEE Transactions on Dependable and Secure Computing (TDSC), 2022
Database: OpenAIRE