Description: |
This paper proposes a neuron-dropout mechanism to control the training pace of mobile devices in federated deep learning. The goal is to accelerate local training on slow mobile devices with minimal impact on training quality, so that slow devices can catch up with fast ones in each training round and the overall training speed increases. The core idea is to skip the computation of neurons with low activation values (i.e., neuron dropout) and to dynamically adjust the dropout rate based on the training progress of each device. The paper introduces two techniques for selecting which neurons to keep, locality-sensitive hashing (LSH) and a max heap, along with a method for dynamically adjusting the dropout rate. It also discusses a few alternative approaches to controlling training pace.
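
To make the mechanism concrete, below is a minimal Python sketch of the two pieces described above: max-heap selection of high-activation neurons and a pace controller for the dropout rate. All function names, parameters, and the rate-adjustment heuristic are illustrative assumptions, not taken from the paper; note also that this sketch selects neurons after computing all activations, whereas the paper's LSH technique is meant to predict high-activation neurons so that the skipped ones are never computed at all.

    import heapq
    import numpy as np

    def dropout_forward(x, W, b, rate):
        """Forward pass for one fully connected layer that keeps only the
        neurons with the highest activation values.  `rate` is the fraction
        of neurons to drop.  (Illustrative sketch; the paper's method avoids
        computing the dropped neurons in the first place.)"""
        a = np.maximum(W @ x + b, 0.0)            # ReLU activations
        k = max(1, int(len(a) * (1.0 - rate)))    # number of neurons to keep
        # heapq.nlargest plays the role of the paper's max-heap technique:
        # it returns the indices of the k largest activations.
        keep = heapq.nlargest(k, range(len(a)), key=a.__getitem__)
        mask = np.zeros_like(a)
        mask[keep] = 1.0
        return a * mask

    def adjust_rate(rate, round_time, target_time, step=0.05, max_rate=0.9):
        """Hypothetical pace controller: raise the dropout rate when the
        device is slower than the per-round target, lower it otherwise."""
        if round_time > target_time:
            return min(rate + step, max_rate)
        return max(rate - step, 0.0)

In this sketch, a slow device that overruns its round deadline has its dropout rate nudged upward, shrinking the effective layer width in later rounds, while a device with slack gradually computes more neurons again; this mirrors the progress-based adjustment described in the summary, though the actual control rule in the paper may differ.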