Softer Pruning, Incremental Regularization
Author: | Yongjun Xu, Zhulin An, Linhang Cai, Chuanguang Yang |
---|---|
Year of publication: | 2020 |
Subject: |
FOS: Computer and information sciences; Computer Science - Computer Vision and Pattern Recognition (cs.CV); Computer Science - Artificial Intelligence (cs.AI); keywords: deep neural networks, network pruning, filter pruning, regularization, monotonic function, convergence |
Source: | ICPR |
DOI: | 10.48550/arxiv.2010.09498 |
Description: | Network pruning is widely used to compress Deep Neural Networks (DNNs). The Soft Filter Pruning (SFP) method zeroizes the pruned filters during training and then updates them in the next training epoch, so the trained information of the pruned filters is completely discarded. To exploit this trained information, we propose a SofteR Filter Pruning (SRFP) method and its variant, Asymptotic SofteR Filter Pruning (ASRFP), which simply decay the pruned weights with a monotonically decreasing parameter. Our methods perform well across various networks, datasets and pruning rates, and are also transferable to weight pruning. On ILSVRC-2012, ASRFP prunes 40% of the parameters of ResNet-34 with a 1.63% top-1 and a 0.68% top-5 accuracy improvement. In theory, SRFP and ASRFP act as an incremental regularization of the pruned filters. We also note that SRFP and ASRFP pursue better results at the cost of slower convergence (a minimal sketch of the decay step follows this record). Comment: 7 pages, ICPR 2020 |
Database: | OpenAIRE |
External link: |
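
The core idea in the abstract — decaying pruned filters with a shrinking parameter instead of zeroizing them as SFP does — can be illustrated with a minimal PyTorch sketch. This is an assumption-laden illustration, not the paper's exact algorithm: the function names `srfp_step` and `asrfp_decay`, the L2-norm filter ranking (a common choice in filter pruning), and the linear decay schedule are all illustrative.

```python
import torch
import torch.nn as nn

def srfp_step(conv: nn.Conv2d, prune_ratio: float, decay: float) -> None:
    """One SRFP-style pruning step (sketch): instead of zeroizing the
    least-important filters as SFP does, scale them by `decay` so their
    trained information is only gradually removed.

    decay=0 recovers SFP's hard zeroizing; decay close to 1 keeps most
    of the pruned filters' trained information.
    """
    with torch.no_grad():
        weight = conv.weight.data  # shape: (out_channels, in_channels, k, k)
        # Rank filters by L2 norm (an assumed, commonly used criterion).
        norms = weight.view(weight.size(0), -1).norm(p=2, dim=1)
        n_pruned = int(weight.size(0) * prune_ratio)
        if n_pruned == 0:
            return
        _, pruned_idx = torch.topk(norms, n_pruned, largest=False)
        # SFP would set these filters to zero; SRFP decays them instead.
        weight[pruned_idx] *= decay

def asrfp_decay(epoch: int, total_epochs: int) -> float:
    """Illustrative asymptotic schedule: the decay parameter decreases
    monotonically from 1 toward 0 over training, so pruned filters are
    suppressed ever more aggressively as training proceeds."""
    return max(0.0, 1.0 - epoch / total_epochs)
```

In use, such a step would run once per epoch after the weight update, e.g. `srfp_step(conv, prune_ratio=0.4, decay=asrfp_decay(epoch, total_epochs))`; as in SFP, the decayed filters keep receiving gradient updates in the next epoch, which is what makes the pruning "soft".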