Communication bounds for convolutional neural networks
Author: | Anthony Chen, James Demmel, Grace Dinh, Mason Haberle, Olga Holtz |
Year of publication: | 2022 |
Subject: |
FOS: Computer and information sciences
Computational Complexity (cs.CC); Distributed, Parallel, and Cluster Computing (cs.DC); Data Structures and Algorithms (cs.DS) |
Source: | Proceedings of the Platform for Advanced Scientific Computing Conference. |
Description: | Convolutional neural networks (CNNs) are important in a wide variety of machine learning tasks and applications, so optimizing their performance is essential. Moving words of data between levels of a memory hierarchy, or between processors over a network, is far more expensive than arithmetic, so minimizing communication is critical to performance. In this paper, we present new lower bounds on data movement for mixed-precision convolutions in both single-processor and parallel distributed-memory models, as well as algorithms that outperform current implementations such as Im2Col. We obtain performance figures using GEMMINI, a machine learning accelerator, where our tiling provides improvements between 13% and 150% over a vendor-supplied algorithm. |
Database: | OpenAIRE |
External link: |
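The abstract compares against Im2Col, the standard lowering that turns a convolution into a single large matrix multiply (GEMM) by copying each sliding-window patch into a column of a matrix; the copy is exactly the data movement the paper's bounds and tilings target. A minimal single-channel, stride-1 sketch of the idea (the function names and shapes here are illustrative, not taken from the paper or from GEMMINI):

```python
import numpy as np

def im2col(x, kh, kw):
    """Lower a single-channel image into a matrix whose columns are the
    kh-by-kw sliding patches, so convolution becomes one GEMM. Note the
    duplication: each pixel is copied into up to kh*kw columns, which is
    the extra data movement that communication-optimal tilings avoid."""
    H, W = x.shape
    oh, ow = H - kh + 1, W - kw + 1
    cols = np.empty((kh * kw, oh * ow), dtype=x.dtype)
    for i in range(oh):
        for j in range(ow):
            cols[:, i * ow + j] = x[i:i + kh, j:j + kw].ravel()
    return cols

def conv2d_im2col(x, k):
    """'Valid' 2-D cross-correlation computed as (flattened kernel) @ im2col."""
    kh, kw = k.shape
    oh, ow = x.shape[0] - kh + 1, x.shape[1] - kw + 1
    return (k.ravel() @ im2col(x, kh, kw)).reshape(oh, ow)
```

In a multi-channel CNN layer the same lowering produces a (filters) x (channels * kh * kw) by (channels * kh * kw) x (output pixels) GEMM; the paper's point is that the lowering's redundant copies make it communication-suboptimal, and direct tilings can do better.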