Showing 1 - 2 of 2 for search: '"Aswani, Himanshu Pradeep"'
Training CNNs from scratch on new domains typically demands large numbers of labeled images and heavy computation, which is not suitable for low-power hardware. One way to reduce these requirements is to modularize the CNN architecture and freeze the weights …
External link:
http://arxiv.org/abs/2110.10969
Author:
Aswani, Himanshu Pradeep; Sethi, Amit
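
The snippet above (arXiv:2110.10969) describes cutting training cost by modularizing a CNN and freezing pre-trained weights. Below is a minimal PyTorch sketch of the general weight-freezing idea only; the backbone choice, layer names, and class count are illustrative assumptions, not the authors' actual method.

# Sketch: freeze a pre-trained convolutional module and train only a
# small task-specific head on the new domain. Illustration only; the
# ResNet-18 backbone and 10 target classes are assumptions.
import torch
import torch.nn as nn
from torchvision import models

# Load a pre-trained backbone (assumed module to be frozen).
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

# Freeze all backbone weights so they receive no gradient updates.
for param in backbone.parameters():
    param.requires_grad = False

# Replace the classifier head with a new, trainable layer for the target domain.
backbone.fc = nn.Linear(backbone.fc.in_features, 10)

# Only the new head's parameters are passed to the optimizer.
optimizer = torch.optim.Adam(backbone.fc.parameters(), lr=1e-3)

Because gradients are computed only for the small head, both the labeled-data and compute requirements per update drop substantially compared with training the full network from scratch.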
Current research suggests that the key factors in designing neural network architectures are the number of filters for every convolution layer, the number of hidden neurons for every fully connected layer, dropout, and pruning. The default activation …
External link:
http://arxiv.org/abs/2009.07793
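
The snippet above (arXiv:2009.07793) lists per-layer filter counts, hidden-layer widths, dropout, pruning, and the default activation as the main architectural choices. The following is a minimal sketch, assuming a standard PyTorch setup, that exposes these choices as explicit hyperparameters and treats the activation as swappable rather than fixed; all concrete numbers are illustrative assumptions, not values from the paper.

# Sketch: a small CNN whose filter counts, hidden width, dropout rate,
# and activation function are passed in as hyperparameters.
import torch.nn as nn

def make_cnn(filters=(32, 64), hidden=128, p_drop=0.5, activation=nn.ReLU):
    # Assumes 3x32x32 inputs; two 2x2 poolings reduce 32x32 to 8x8.
    return nn.Sequential(
        nn.Conv2d(3, filters[0], kernel_size=3, padding=1),
        activation(),
        nn.MaxPool2d(2),
        nn.Conv2d(filters[0], filters[1], kernel_size=3, padding=1),
        activation(),
        nn.MaxPool2d(2),
        nn.Flatten(),
        nn.Linear(filters[1] * 8 * 8, hidden),
        activation(),
        nn.Dropout(p_drop),
        nn.Linear(hidden, 10),
    )

# Swapping the default activation is a one-argument change:
model = make_cnn(activation=nn.GELU)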