FedFNN: Faster Training Convergence Through Update Predictions in Federated Recommender Systems

Authors: Fabbri, Francesco, Liu, Xianghang, McKenzie, Jack R., Twardowski, Bartlomiej, Wijaya, Tri Kurniawan
Publication Year: 2023
Document Type: Working Paper
Description: Federated Learning (FL) has emerged as a key approach for distributed machine learning, enhancing online personalization while ensuring user data privacy. Instead of sending private data to a central server as in traditional approaches, FL decentralizes computation: devices train locally and share updates with a global server. A primary challenge in this setting is achieving fast and accurate model training - vital for recommendation systems, where delays can compromise user engagement. This paper introduces FedFNN, an algorithm that accelerates decentralized model training. In FL, only a subset of users is involved in each training epoch. FedFNN employs supervised learning to predict the weight updates of unsampled users from the updates of the sampled set. Our evaluations, using real and synthetic data, show: 1. FedFNN achieves training speeds 5x faster than leading methods while maintaining or improving accuracy; 2. the algorithm's performance is consistent regardless of client cluster variations; 3. FedFNN outperforms other methods in scenarios with limited client availability, converging more quickly.
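The core idea in the abstract - predicting the weight updates of unsampled clients from the updates of the sampled subset, then aggregating both - can be illustrated with a minimal sketch. Everything here is hypothetical: the toy local-training rule, the client setup, and the predictor (a simple mean over sampled updates standing in for the paper's learned feedforward network) are assumptions for illustration, not the authors' actual method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 10 clients, a model with 4 weights.
n_clients, dim = 10, 4
global_w = np.zeros(dim)
# Per-client offsets standing in for heterogeneous local data.
client_bias = rng.normal(size=(n_clients, dim))

def local_update(w, i):
    # Toy local training step: pull weights toward client i's optimum.
    return 0.5 * (client_bias[i] - w)

sampled = [0, 1, 2]  # clients participating in this round
unsampled = [i for i in range(n_clients) if i not in sampled]

updates = {i: local_update(global_w, i) for i in sampled}

# The idea described in the abstract: a supervised model predicts the
# updates of unsampled clients from the sampled updates. Here a
# deliberately simple stand-in predictor (mean of sampled updates);
# FedFNN itself uses a learned predictor.
predicted = np.mean([updates[i] for i in sampled], axis=0)
for i in unsampled:
    updates[i] = predicted

# The server aggregates over ALL clients, real and predicted alike,
# so every round moves the global model as if all clients had trained.
global_w = global_w + np.mean([updates[i] for i in range(n_clients)], axis=0)
print(global_w.shape)
```

The intuition is that a single round carries information from every client, not just the sampled ones, which is why such schemes can converge in fewer rounds when client availability is limited.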
Database: arXiv