DropNet: Reducing Neural Network Complexity via Iterative Pruning
Author: | Tan, Chong Min John, Motani, Mehul |
---|---|
Year of publication: | 2022 |
Subject: | |
Source: | Proceedings of the 37th International Conference on Machine Learning, PMLR 119:9356-9366, 2020 https://proceedings.mlr.press/v119/tan20a.html |
Document type: | Working Paper |
DOI: | 10.5555/3524938.3525805 |
Description: | Modern deep neural networks require a significant amount of computing time and power to train and deploy, which limits their usage on edge devices. Inspired by the iterative weight pruning in the Lottery Ticket Hypothesis, we propose DropNet, an iterative pruning method which prunes nodes/filters to reduce network complexity. DropNet iteratively removes nodes/filters with the lowest average post-activation value across all training samples. Empirically, we show that DropNet is robust across diverse scenarios, including MLPs and CNNs using the MNIST, CIFAR-10 and Tiny ImageNet datasets. We show that up to 90% of the nodes/filters can be removed without any significant loss of accuracy. The final pruned network performs well even with reinitialization of the weights and biases. DropNet also has similar accuracy to an oracle which greedily removes nodes/filters one at a time to minimise training loss, highlighting its effectiveness. Comment: Published at ICML 2020. Code can be found at https://github.com/tanchongmin/DropNet |
Database: | arXiv |
External link: |
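The pruning criterion described in the abstract — iteratively removing the nodes/filters with the lowest average post-activation value across all training samples — can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation (see their GitHub repository for that); the function name, the use of the absolute value of activations, and the per-step prune fraction are all assumptions made here for illustration.

```python
import numpy as np

def dropnet_prune_step(activations, keep_mask, prune_fraction=0.2):
    """One DropNet-style pruning step (illustrative sketch).

    activations: array of shape (num_samples, num_nodes) with the
        post-activation values of one layer over the training set.
    keep_mask: boolean array of shape (num_nodes,) marking nodes
        that are still active.
    Returns a new mask with the lowest-scoring fraction of the
    remaining nodes removed.
    """
    # Average post-activation magnitude per node across all samples
    # (taking |.| is an assumption; the paper scores by average
    # post-activation value).
    scores = np.abs(activations).mean(axis=0)
    # Already-pruned nodes should never be selected again.
    scores = np.where(keep_mask, scores, np.inf)
    n_active = int(keep_mask.sum())
    n_drop = int(prune_fraction * n_active)
    if n_drop == 0:
        return keep_mask
    # Drop the n_drop nodes with the smallest average activation.
    drop_idx = np.argsort(scores)[:n_drop]
    new_mask = keep_mask.copy()
    new_mask[drop_idx] = False
    return new_mask
```

In an iterative scheme, this step would be interleaved with retraining: train, record activations, prune the lowest-scoring nodes, and repeat until the target sparsity (up to ~90% per the abstract) is reached.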