A Distributed Computing Framework Based on Variance Reduction Method to Accelerate Training Machine Learning Models

Authors: Yuan Yuan, Jinyan Qiu, Yuxing Peng, Feng Liu, Hangjun Zhou, Dongsheng Li, Changjian Wang, Mingxing Tang, Zhen Huang
Year of publication: 2020
Source: 2020 IEEE International Conference on Joint Cloud Computing.
DOI: 10.1109/jcc49151.2020.00014
Description: To support large-scale intelligent applications, distributed machine learning based on JointCloud is an intuitive solution. However, distributed machine learning models are difficult to train because the corresponding optimization solvers converge slowly and place heavy demands on computing and memory resources. To overcome these challenges, we propose a computing framework for the L-BFGS optimization algorithm based on a variance reduction method, which can use a fixed, large learning rate to achieve a linear convergence rate. To validate our claims, we conducted several experiments on multiple classical datasets. The experimental results show that the computing framework accelerates the training of the solver and obtains accurate results for machine learning algorithms.
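The abstract does not spell out the algorithm, so the following is only a rough, hypothetical sketch of the variance-reduction idea such a framework builds on: plain SVRG-style variance-reduced SGD on a toy least-squares problem, written in NumPy. It omits the paper's L-BFGS component, and every name, parameter, and the toy problem itself are illustrative assumptions, not taken from the paper.

    import numpy as np

    # Toy least-squares problem (assumed, not from the paper):
    # f(w) = (1/2n) * ||X w - y||^2
    rng = np.random.default_rng(0)
    n, d = 1000, 20
    X = rng.standard_normal((n, d)) / np.sqrt(d)   # rows have ~unit norm
    w_true = rng.standard_normal(d)
    y = X @ w_true + 0.1 * rng.standard_normal(n)  # noisy labels
    w_star = np.linalg.lstsq(X, y, rcond=None)[0]  # exact minimizer, for reference

    def grad_i(w, i):
        """Gradient of the i-th component loss (1/2)(x_i . w - y_i)^2."""
        return (X[i] @ w - y[i]) * X[i]

    def full_grad(w):
        """Full-batch gradient of f."""
        return X.T @ (X @ w - y) / n

    def svrg(w0, lr=0.1, outer=30):
        """SVRG sketch: SGD with a variance-reduced gradient and a fixed step size."""
        w = w0.copy()
        for _ in range(outer):
            w_snap = w.copy()        # snapshot iterate
            mu = full_grad(w_snap)   # full gradient at the snapshot
            for _ in range(n):
                i = rng.integers(n)
                # Unbiased gradient estimate whose variance shrinks as both
                # w and w_snap approach the optimum, so the fixed learning
                # rate never has to be decayed to reach high accuracy.
                g = grad_i(w, i) - grad_i(w_snap, i) + mu
                w -= lr * g
        return w

    w = svrg(np.zeros(d))
    print(np.linalg.norm(w - w_star))  # distance to the exact minimizer; should be tiny

The design point this illustrates: in variance-reduced methods the gradient estimator's variance vanishes at the optimum, which is what permits a fixed (rather than decaying) learning rate and a linear convergence rate; per the abstract, the proposed framework pairs this idea with the L-BFGS solver.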
Database: OpenAIRE