A distributed computing framework based on lightweight variance reduction method to accelerate machine learning training on blockchain
Author: | Zhen Huang, Feng Liu, Yuxing Peng, Jinyan Qiu, Mingxing Tang |
Year of publication: | 2020 |
Subject: | Blockchain; Distributed computing; Machine learning; Variance reduction; Optimization algorithm; Solver; Artificial intelligence; Computer science; Computer Networks and Communications; Electrical and Electronic Engineering |
Source: | China Communications, 17: 77-89 |
ISSN: | 1673-5447 |
DOI: | 10.23919/jcc.2020.09.007 |
Description: | To securely support large-scale intelligent applications, distributed machine learning based on blockchain is an intuitive solution. However, distributed machine learning models are difficult to train because the corresponding optimization solvers converge slowly and place high demands on computing and memory resources. To overcome these challenges, we propose a distributed computing framework for the L-BFGS optimization algorithm based on a variance reduction method: a lightweight, parallelized scheme that adds little extra cost to the model training process. To validate these claims, we conducted several experiments on multiple classical datasets. The results show that the proposed computing framework steadily accelerates the training of the solver in both local and distributed modes. |
Database: | OpenAIRE |
External link: |
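The description above names two ingredients, L-BFGS and variance reduction, without spelling either out. Below is a minimal sketch of the variance-reduction idea alone (an SVRG-style corrected stochastic gradient) on a toy least-squares problem; it is an illustration under assumed parameters, not the paper's framework, and it omits the L-BFGS curvature updates and the blockchain/distributed layer entirely.

```python
# Illustrative sketch only: an SVRG-style variance-reduced stochastic gradient
# loop on a synthetic least-squares problem. All sizes, step sizes, and batch
# sizes are hypothetical choices for the example.
import numpy as np

rng = np.random.default_rng(0)
n, d = 1000, 20
A = rng.normal(size=(n, d))
x_true = rng.normal(size=d)
b = A @ x_true + 0.1 * rng.normal(size=n)

def full_grad(x):
    # Gradient of (1/2n) * ||Ax - b||^2 over the whole dataset.
    return A.T @ (A @ x - b) / n

def batch_grad(x, idx):
    # Gradient of the same loss restricted to a mini-batch of rows.
    Ai, bi = A[idx], b[idx]
    return Ai.T @ (Ai @ x - bi) / len(idx)

x = np.zeros(d)
step = 0.1
for epoch in range(20):
    snapshot = x.copy()
    mu = full_grad(snapshot)  # full gradient at the periodic snapshot point
    for _ in range(50):
        idx = rng.choice(n, size=32, replace=False)
        # Variance-reduced direction: the mini-batch gradient is re-centered by
        # the difference between the snapshot's batch and full gradients.
        g = batch_grad(x, idx) - batch_grad(snapshot, idx) + mu
        x -= step * g

print("residual gradient norm:", np.linalg.norm(full_grad(x)))
```

The key point of the correction term is that each mini-batch gradient is re-centered using a full gradient computed at a periodic snapshot, which shrinks the variance of the stochastic direction and permits larger, more stable steps; the cited paper pairs this kind of variance-reduced gradient with L-BFGS updates in a distributed setting.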