Showing 1 - 2 of 2 results for search: '"Barkai, Saar"'
Cloud computing is becoming increasingly popular as a platform for distributed training of deep neural networks. Synchronous stochastic gradient descent (SSGD) suffers from substantial slowdowns due to stragglers if the environment is non-dedicated, ... (a toy sketch of the straggler effect follows this entry)
External link:
http://arxiv.org/abs/1909.10802
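The snippet above refers to the straggler problem in synchronous data-parallel SGD: every update waits at a barrier for all workers, so one slow worker delays the whole step, whereas asynchronous workers proceed independently. The following is a minimal, hypothetical Python sketch of that timing effect only; the per-worker times are invented for illustration and are not taken from the paper.

```python
# Toy timing model (illustrative assumption, not from the paper): in synchronous
# SGD the parameter update waits for all workers, so step time = slowest worker;
# asynchronous workers apply their updates independently, with no shared barrier.

def ssgd_step_time(worker_times):
    # Barrier: every gradient must arrive before the update is applied.
    return max(worker_times)

def asgd_avg_step_time(worker_times):
    # No barrier: each worker's update lands as soon as it finishes.
    return sum(worker_times) / len(worker_times)

# Hypothetical per-step compute times (seconds); one non-dedicated straggler.
times = [1.0, 1.1, 0.9, 1.0, 1.2, 1.0, 1.1, 4.0]

print("synchronous step time:", ssgd_step_time(times))      # 4.0, straggler-bound
print("average worker time  :", asgd_avg_step_time(times))  # ~1.41
```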
Although distributed computing can significantly reduce the training time of deep neural networks, scaling the training process while maintaining high efficiency and final accuracy is challenging. Distributed asynchronous training enjoys near-linear ... (a toy gradient-staleness sketch follows this entry)
External link:
http://arxiv.org/abs/1907.11612
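The second snippet concerns scaling asynchronous distributed training, where a well-known difficulty is that a worker's gradient may be computed on an out-of-date copy of the parameters. The sketch below is a hypothetical single-process simulation of that delay on a toy least-squares problem; the delay pattern, learning rate, and data are assumptions for illustration, not the method of the linked paper.

```python
# Hypothetical simulation of stale-gradient SGD on a 1-D least-squares problem
# (loss 0.5 * (y - theta * x)^2, data generated with y = 2 * x).

def grad(theta, x, y):
    # d/d(theta) of 0.5 * (y - theta * x)^2
    return -(y - theta * x) * x

theta = 0.0
history = [theta]                 # past parameter versions held by the "server"
lr = 0.02
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0), (4.0, 8.0)] * 5

for step, (x, y) in enumerate(data):
    staleness = min(step % 4, len(history) - 1)  # assumed delay pattern
    stale_theta = history[-1 - staleness]        # worker's out-of-date snapshot
    g = grad(stale_theta, x, y)                  # gradient of stale parameters
    theta -= lr * g                              # server applies it anyway
    history.append(theta)

print("final theta:", theta)  # drifts toward 2.0; larger staleness or lr destabilizes it
```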