Predicting statistics of asynchronous SGD parameters for a large-scale distributed deep learning system on GPU supercomputers
Author: Yosuke Oyama, Hiroki Nishimura, Satoshi Matsuoka, Akihiro Nomura, Ikuro Sato, Yukimasa Tamatsu
Year of publication: 2016
Subjects: Computer science, Deep learning, Machine learning, Artificial intelligence, Convolutional neural network, Stochastic gradient descent, Stochastic process, Generalization error, Probability distribution, Asynchronous communication, Computer engineering, Information systems, Image processing
Source: IEEE BigData
DOI: 10.1109/bigdata.2016.7840590
Description: Many studies have shown that Deep Convolutional Neural Networks (DCNNs) achieve high accuracy on image recognition tasks when given large training datasets. An optimization technique known as asynchronous mini-batch Stochastic Gradient Descent (SGD) is widely used in deep learning because it offers fast training and good recognition accuracy, but it may increase the generalization error if its training parameters fall into inappropriate ranges. We propose a performance model of a distributed DCNN training system called SPRINT that uses asynchronous GPU processing based on mini-batch SGD. The model considers the probability distributions of mini-batch size and gradient staleness, the core parameters of asynchronous SGD training. Taking the DCNN architecture and machine specifications as input, the model predicts the time to sweep the entire dataset, the mini-batch size, and the staleness with average errors of 5%, 9%, and 19%, respectively, on several supercomputers with up to thousands of GPUs. Experimental results on two different supercomputers show that our model can reliably choose the fastest machine configuration that nearly meets a target mini-batch size. A toy simulation of these two distributions is sketched after this record.
Database: OpenAIRE
External link:
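The description above centers on two coupled quantities of asynchronous SGD: the effective mini-batch size (the number of gradients the parameter server aggregates per update) and gradient staleness (the number of model updates applied while a gradient was being computed). The paper's actual SPRINT performance model is not reproduced here; the following toy discrete-event simulation is only a minimal sketch of how distributions of these two quantities emerge from machine parameters such as worker count and per-gradient compute time. All function and variable names are hypothetical, and the exponentially distributed gradient-computation time is an assumption for illustration, not taken from the paper.

```python
import random

def simulate_async_sgd(n_workers, mean_grad_time, update_time, n_updates, seed=0):
    """Toy discrete-event simulation of asynchronous mini-batch SGD.

    Workers repeatedly compute gradients and send them to a parameter
    server; the server aggregates every gradient that has arrived since
    its last update. The number of gradients aggregated per update is
    the effective mini-batch size, and a gradient's staleness is the
    number of model updates applied since its worker read the model.
    """
    rng = random.Random(seed)
    # Absolute time at which each worker's current gradient finishes,
    # and the model version that gradient was computed from.
    # ASSUMPTION: gradient compute times are exponentially distributed.
    done_at = [rng.expovariate(1.0 / mean_grad_time) for _ in range(n_workers)]
    based_on = [0] * n_workers
    clock, version = 0.0, 0
    batch_sizes, staleness = [], []

    for _ in range(n_updates):
        clock = max(clock, min(done_at))   # wait until >= 1 gradient arrives
        arrived = [w for w in range(n_workers) if done_at[w] <= clock]
        batch_sizes.append(len(arrived))
        for w in arrived:
            staleness.append(version - based_on[w])
            based_on[w] = version + 1      # worker restarts from the new model
            done_at[w] = clock + rng.expovariate(1.0 / mean_grad_time)
        version += 1
        clock += update_time               # server-side update cost

    return batch_sizes, staleness

# Example: 64 GPU workers, ~1 s average gradient computation,
# 50 ms per server-side update (all values hypothetical).
sizes, stale = simulate_async_sgd(64, 1.0, 0.05, 10_000)
print("mean mini-batch size:", sum(sizes) / len(sizes))
print("mean staleness:", sum(stale) / len(stale))
```

Sweeping `n_workers` or `mean_grad_time` in this sketch and inspecting the resulting distributions mirrors, at toy scale, the kind of what-if analysis the paper's model performs for real supercomputer configurations when choosing a machine configuration that meets a target mini-batch size.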