HOTCAKE: Higher Order Tucker Articulated Kernels for Deeper CNN Compression
Author: Rui Lin, Zhuolun He, Ching-Yun Ko, Cong Chen, Hao Yu, Yuan Cheng, Graziano Chesi, Ngai Wong
Language: English
Year of publication: 2020
Subject: FOS: Computer and information sciences; Computer Science - Machine Learning (cs.LG); Computer Science - Computer Vision and Pattern Recognition (cs.CV); Statistics - Machine Learning (stat.ML); Rank (linear algebra); Artificial neural network; Convolutional neural network; Matrix decomposition; Kernel (image processing); Decomposition (computer science); Tensor; Algorithm; Tucker decomposition
Description: Emerging edge computing has spurred immense interest in compacting neural networks without sacrificing much accuracy. In this regard, low-rank tensor decomposition constitutes a powerful tool to compress convolutional neural networks (CNNs) by decomposing the 4-way kernel tensor into multi-stage smaller ones. Building on top of Tucker-2 decomposition, we propose a generalized Higher Order Tucker Articulated Kernels (HOTCAKE) scheme comprising four steps: input channel decomposition, guided Tucker rank selection, higher order Tucker decomposition, and fine-tuning. By subjecting each CONV layer to HOTCAKE, a highly compressed CNN model with a graceful accuracy trade-off is obtained. Experiments show HOTCAKE can compress even pre-compressed models and produce state-of-the-art lightweight networks.
Comment: 6 pages, 5 figures
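To make the core idea concrete, the sketch below applies a plain higher order Tucker decomposition to a 4-way convolution kernel with TensorLy. It is only an illustration under assumed, arbitrary ranks; it is not the HOTCAKE pipeline described in the paper (no input channel decomposition, guided rank selection, or fine-tuning), and the kernel shape and rank values are hypothetical.

```python
# Minimal sketch (assumption): Tucker-decompose a 4-way CNN kernel with
# TensorLy at arbitrary illustrative ranks, then compare parameter counts
# and reconstruction error. Not the authors' HOTCAKE implementation.
import numpy as np
import tensorly as tl
from tensorly.decomposition import tucker

# Toy 4-way kernel: (out_channels, in_channels, kernel_h, kernel_w)
kernel = np.random.randn(128, 64, 3, 3)

# Illustrative Tucker ranks for the four modes (hypothetical values)
ranks = [32, 16, 3, 3]

# Higher order Tucker decomposition: small core + one factor per mode
core, factors = tucker(tl.tensor(kernel), rank=ranks)

# Parameter counts before and after decomposition
orig_params = kernel.size
comp_params = core.size + sum(f.size for f in factors)
print(f"compression ratio: {orig_params / comp_params:.2f}x")

# Relative error of the low-rank approximation (before any fine-tuning)
approx = tl.tucker_to_tensor((core, factors))
rel_err = np.linalg.norm(kernel - approx) / np.linalg.norm(kernel)
print(f"relative reconstruction error: {rel_err:.3f}")
```

In practice, the factors and core would be mapped back onto a sequence of smaller convolution layers and the network fine-tuned to recover accuracy, as the abstract's fourth step indicates.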
Database: OpenAIRE
External link: