Published in:
Entropy, Vol 22, Iss 5, p 544 (2020)
When gradient descent (GD) is scaled to many parallel workers for large-scale machine learning problems, its per-iteration computation time is limited by the straggling workers. Straggling workers can be tolerated by assigning redundant computations
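The redundant-assignment idea can be sketched as follows. This is a minimal illustration under simple assumptions, not the paper's actual scheme: each data partition is replicated to r workers round-robin, so the master can recover the full gradient from any subset of responding workers that jointly covers all partitions. The function names are hypothetical.

```python
def assign_redundant(num_partitions, num_workers, r):
    """Assign each data partition to r distinct workers (round-robin),
    so the full gradient can still be recovered when some workers straggle.
    Illustrative sketch; not the scheme from the paper."""
    assignment = {w: [] for w in range(num_workers)}
    for p in range(num_partitions):
        for j in range(r):
            assignment[(p + j) % num_workers].append(p)
    return assignment

def recoverable(assignment, responded, num_partitions):
    """True if the responding (non-straggling) workers jointly cover
    every partition, i.e. the master can aggregate the full gradient."""
    covered = set()
    for w in responded:
        covered.update(assignment[w])
    return len(covered) == num_partitions
```

With 4 partitions, 4 workers, and replication factor r = 2, any single straggler can be tolerated: the remaining three workers always cover all partitions, while a lone worker cannot.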