Automatic DenseNet Sparsification

Author: Tao Li, Wencong Jiao, Li-Na Wang, Guoqiang Zhong
Language: English
Year of publication: 2020
Subject:
Source: IEEE Access, Vol 8, pp. 62561-62571 (2020)
Document type: article
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2020.2984130
Description: As a classic and well-performing deep convolutional neural network, DenseNet connects every layer to all of its preceding layers via skip connections. However, this dense connectivity introduces considerable redundancy and consumes a large amount of computational resources. In this paper, to automatically prune redundant skip connections in DenseNet, we introduce a novel reinforcement learning method called automatic DenseNet sparsification (ADS). In ADS, we use an adjacency matrix to represent the dense connections in DenseNet and design an agent based on recurrent neural networks (RNNs) to sparsify the matrix, i.e., to remove redundant skip connections in DenseNet. The validation accuracies of the sparsified DenseNets are used as rewards to update the agent, which encourages the agent to generate sparsified DenseNets with high performance. Extensive experiments demonstrate the effectiveness of ADS: the performance of the sparsified DenseNet surpasses not only the original DenseNet but also related models, and the sparsified DenseNet exhibits strong transferability when applied to new tasks. More importantly, ADS is very efficient: compressing a 40-layer DenseNet takes less than one day on a single GPU.
Database: Directory of Open Access Journals
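
The description above outlines the ADS loop: an RNN agent samples a binary mask over the adjacency matrix of skip connections, the sparsified DenseNet is evaluated, and its validation accuracy serves as the reward that updates the agent. The following is a minimal sketch of that loop in PyTorch, not the authors' implementation; the block size, controller architecture, and the reward function `evaluate_sparsified_densenet` (stubbed with a toy score instead of real validation accuracy) are assumptions for illustration, and the policy-gradient update uses plain REINFORCE with a moving-average baseline.

```python
# Minimal sketch of the ADS idea described in the abstract (not the authors' code).
import torch
import torch.nn as nn

NUM_LAYERS = 12                                    # layers in one dense block (assumption)
NUM_EDGES = NUM_LAYERS * (NUM_LAYERS - 1) // 2     # candidate skip connections

class SparsifierAgent(nn.Module):
    """LSTM controller that samples keep/prune decisions for each skip connection."""
    def __init__(self, hidden=64):
        super().__init__()
        self.embed = nn.Embedding(2, hidden)       # previous decision -> embedding
        self.lstm = nn.LSTMCell(hidden, hidden)
        self.head = nn.Linear(hidden, 2)           # logits for {prune, keep}

    def sample(self):
        h = c = torch.zeros(1, self.lstm.hidden_size)
        prev = torch.zeros(1, dtype=torch.long)
        decisions, log_probs = [], []
        for _ in range(NUM_EDGES):
            h, c = self.lstm(self.embed(prev), (h, c))
            dist = torch.distributions.Categorical(logits=self.head(h))
            action = dist.sample()
            log_probs.append(dist.log_prob(action))
            decisions.append(action.item())
            prev = action
        return decisions, torch.stack(log_probs).sum()

def evaluate_sparsified_densenet(mask):
    """Hypothetical reward: validation accuracy of the DenseNet whose skip
    connections are kept according to `mask`. Stubbed with a toy score here."""
    return sum(mask) / len(mask)                   # placeholder, NOT a real accuracy

agent = SparsifierAgent()
opt = torch.optim.Adam(agent.parameters(), lr=1e-3)
baseline = 0.0
for step in range(10):                             # a few REINFORCE updates
    mask, log_prob = agent.sample()
    reward = evaluate_sparsified_densenet(mask)
    baseline = 0.9 * baseline + 0.1 * reward       # moving-average baseline
    loss = -(reward - baseline) * log_prob
    opt.zero_grad()
    loss.backward()
    opt.step()
```

In a full implementation, the reward stub would be replaced by training the masked DenseNet for a few epochs and measuring its validation accuracy, which is the costly step the paper reports takes under one day for a 40-layer DenseNet on a single GPU.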