A Novel Distributed Variant of Stochastic Gradient Descent and Its Optimization

Author: Jianping Yin, Yi-qi Wang, Zhan Shi, Yawei Zhao
Year of publication: 2017
Source: DEStech Transactions on Computer Science and Engineering.
ISSN: 2475-8841
Description: In the age of big data, large-scale learning problems have become increasingly significant. Distributed machine learning algorithms therefore draw considerable interest, particularly those based on Stochastic Gradient Descent (SGD) with variance reduction techniques. In this paper, we propose and implement a distributed programming strategy for a newly developed variance-reduced SGD-based algorithm and analyze its performance under various parameter settings. Moreover, a new SGD-based algorithm named BATCHVR is introduced, which computes the full gradients required by SGD in each stage using batches in an incremental manner. Experiments on the TH-1A HPC cluster demonstrate the effectiveness of the distributed strategy and the excellent performance of the proposed algorithm.
Database: OpenAIRE
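
The description above characterizes BATCHVR only at a high level: an SVRG-style variance-reduced SGD in which each stage's full gradient is accumulated incrementally over batches rather than in a single pass. The Python sketch below illustrates that general idea under those assumptions; the function name batchvr_sketch, the update rule, and all parameter choices are illustrative guesses, not the paper's actual algorithm or code.

    import numpy as np

    def batchvr_sketch(grad_fn, w, data, step_size=0.01, n_stages=10,
                       inner_iters=100, batch_size=32, seed=0):
        # Hypothetical SVRG-style loop. The snapshot (full) gradient is
        # built incrementally, one batch at a time, mirroring the batched
        # full-gradient computation the description attributes to BATCHVR.
        rng = np.random.default_rng(seed)
        n = len(data)
        for _ in range(n_stages):
            w_snap = w.copy()
            full_grad = np.zeros_like(w)
            # Incremental, batched accumulation of the full gradient
            # at the snapshot point.
            for start in range(0, n, batch_size):
                for example in data[start:start + batch_size]:
                    full_grad += grad_fn(w_snap, example)
            full_grad /= n
            # Inner loop: variance-reduced stochastic updates, as in SVRG.
            for _ in range(inner_iters):
                i = rng.integers(n)
                v = grad_fn(w, data[i]) - grad_fn(w_snap, data[i]) + full_grad
                w = w - step_size * v
        return w

    # Illustrative use on a least-squares problem (not from the paper):
    features = np.random.default_rng(1).normal(size=(200, 5))
    targets = features @ np.ones(5)
    examples = list(zip(features, targets))
    grad = lambda w, ex: 2.0 * (ex[0] @ w - ex[1]) * ex[0]
    w_est = batchvr_sketch(grad, np.zeros(5), examples)

In a distributed setting, the batched accumulation loop is the natural point to parallelize: each worker can sum gradients over its own batches and the partial sums can then be reduced. The record does not specify whether the paper's distributed strategy works this way; this is only one plausible reading of "batches in an incremental manner."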