Pushing the limits of RNN Compression

Authors: Ganesh Dasika, Jesse Beu, Matthew Mattina, Chu Zhou, Dibakar Gope, Urmish Thakker, Igor Fedorov
Year of publication: 2019
Subject:
Source: EMC2@NeurIPS
DOI: 10.1109/emc2-nips53020.2019.00012
Description: Recurrent Neural Networks (RNNs) can be difficult to deploy on resource-constrained devices because of their size. As a result, there is a need for compression techniques that can significantly compress RNNs without negatively impacting task accuracy. This paper introduces a method for compressing RNNs for resource-constrained environments using Kronecker products (KPs). KPs can compress RNN layers by 16-38x with minimal accuracy loss. We show that KP compression can beat the task accuracy achieved by other state-of-the-art compression techniques (pruning and low-rank matrix factorization) across 4 benchmarks spanning 3 different applications, while simultaneously improving inference run-time.
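The description only sketches the Kronecker-product idea, so the following is a minimal illustrative sketch (not taken from the paper; the shapes and names are assumptions) of how a dense RNN weight matrix can be replaced by the Kronecker product of two much smaller factors, and how the matrix-vector product can be evaluated without materializing the dense matrix:

```python
import numpy as np

# Minimal sketch (illustrative, not the paper's implementation): a KP-compressed
# layer stores two small factors A and B instead of a dense weight matrix W,
# with W = A (kron) B. The shapes below are arbitrary assumptions.
A = np.random.randn(16, 16)   # 256 parameters
B = np.random.randn(16, 16)   # 256 parameters
W = np.kron(A, B)             # dense equivalent: 256 x 256 = 65,536 parameters

print(f"compression: {W.size / (A.size + B.size):.0f}x fewer parameters")

# At inference, the dense W never has to be formed. For an input vector x,
# reshaping x into a 16 x 16 matrix X (row-major) gives
#   (A kron B) @ x == row-flatten(A @ X @ B.T),
# which needs far fewer multiply-accumulates than the dense product.
x = np.random.randn(256)
X = x.reshape(16, 16)
y_dense = W @ x
y_kp = (A @ X @ B.T).reshape(-1)
assert np.allclose(y_dense, y_kp)
```

The compression ratio in this toy example depends entirely on the assumed factor shapes; the 16-38x layer-level figures quoted in the description come from the paper's own configurations, not from this sketch.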
Comment: 6 pages. arXiv admin note: substantial text overlap with arXiv:1906.02876
Database: OpenAIRE