Compute, Time and Energy Characterization of Encoder-Decoder Networks with Automatic Mixed Precision Training

Author: Samsi, Siddharth; Jones, Michael; Veillette, Mark M.
Publication Year: 2020
Subject:
Document Type: Working Paper
DOI: 10.1109/HPEC43674.2020.9286241
Description: Deep neural networks have shown great success in many diverse fields. Training these networks can take significant amounts of time, compute and energy. As datasets grow larger and models become more complex, the exploration of model architectures becomes prohibitive. In this paper we examine the compute, energy and time costs of training a UNet-based deep neural network for the problem of predicting short-term weather forecasts (called precipitation nowcasting). By leveraging a combination of data-distributed and mixed-precision training, we explore the design space for this problem. We also show that larger models with better performance come at only an incremental additional cost if appropriate optimizations are used. We show that it is possible to achieve a significant improvement in training time by leveraging mixed-precision training without sacrificing model performance. Additionally, we find that a 1549% increase in the number of trainable parameters comes at a comparatively modest 63.22% increase in energy usage for a UNet with 4 encoding layers.
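The mixed-precision training referred to above is typically implemented through framework-level automatic mixed precision (AMP) support. The following is a minimal sketch of what a single AMP training step can look like, assuming PyTorch's torch.cuda.amp API and a hypothetical placeholder model; the record does not specify the authors' actual implementation.

import torch
import torch.nn as nn
from torch.cuda.amp import autocast, GradScaler

# Placeholder network standing in for the UNet-style encoder-decoder (hypothetical).
model = nn.Sequential(nn.Conv2d(1, 16, 3, padding=1),
                      nn.ReLU(),
                      nn.Conv2d(16, 1, 3, padding=1)).cuda()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.MSELoss()
scaler = GradScaler()  # scales the loss to avoid FP16 gradient underflow

def train_step(inputs, targets):
    optimizer.zero_grad()
    with autocast():                   # forward pass runs in mixed precision
        loss = criterion(model(inputs), targets)
    scaler.scale(loss).backward()      # backward pass on the scaled loss
    scaler.step(optimizer)             # unscales gradients, then steps the optimizer
    scaler.update()                    # adjusts the loss scale for the next iteration
    return loss.item()

# Example usage with random tensors in place of the nowcasting data (hypothetical shapes).
x = torch.randn(8, 1, 64, 64, device="cuda")
y = torch.randn(8, 1, 64, 64, device="cuda")
print(train_step(x, y))

For the data-distributed part of the setup described in the abstract, the same step would typically be applied to a model wrapped in torch.nn.parallel.DistributedDataParallel; energy measurement is outside the scope of this sketch.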
Comment: Accepted for publication at IEEE HPEC 2020
Database: arXiv