Improving optimization of convolutional neural networks through parameter fine-tuning
Author: | Scott Nykl, Kenneth M. Hopkinson, Nicholas C. Becherer, John M. Pecarina |
Year: | 2017 |
Subject: | Fine-tuning, Artificial neural network, Contextual image classification, Computer science, Deep learning, Initialization, Machine learning, Convolutional neural network, Domain (software engineering), Artificial intelligence, Transfer of learning, Software |
Source: | Neural Computing and Applications. 31:3469-3479 |
ISSN: | 1433-3058, 0941-0643 |
Description: | In recent years, convolutional neural networks have achieved state-of-the-art performance on a number of computer vision problems such as image classification. Prior research has shown that a transfer learning technique known as parameter fine-tuning, wherein a network is pre-trained on a different dataset, can boost the performance of these networks. However, identifying the best source dataset and learning strategy for a given target domain remains largely unexplored. This research therefore presents and evaluates several transfer learning methods for fine-grained image classification, as well as their effect on ensemble networks. The results clearly demonstrate the effectiveness of parameter fine-tuning over random initialization. We find that training should not be shortened after transferring weights, that larger, more similar datasets tend to be the best source tasks, and that parameter fine-tuning can often outperform randomly initialized ensembles. The experimental framework and findings will help practitioners train models with improved accuracy. |
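The parameter fine-tuning described in the abstract can be illustrated with a minimal sketch: weights learned on a source task initialize the target network, while the task-specific classifier head is re-initialized for the new label set. This is not the authors' code; the parameter names, layer shapes, and class counts below are purely illustrative assumptions.

```python
import numpy as np

def init_params(num_classes, seed=0):
    # Tiny stand-in for a CNN: one conv filter bank plus a classifier head.
    rng = np.random.default_rng(seed)
    return {
        "conv_w": rng.standard_normal((8, 3, 3, 3)) * 0.01,  # shared feature extractor
        "fc_w": rng.standard_normal((num_classes, 8)) * 0.01, # task-specific head
    }

def fine_tune_init(source_params, num_target_classes, seed=1):
    """Parameter fine-tuning: copy the source network's feature weights,
    but re-initialize the classifier head for the target label set.
    All layers then continue training on the target data (no freezing,
    echoing the finding that training should not be shortened)."""
    rng = np.random.default_rng(seed)
    params = {k: v.copy() for k, v in source_params.items()}
    params["fc_w"] = rng.standard_normal((num_target_classes, 8)) * 0.01
    return params

# Pre-train on a large source task (e.g. 1000 classes), transfer to a 10-class target.
source = init_params(num_classes=1000)
target = fine_tune_init(source, num_target_classes=10)

assert np.array_equal(target["conv_w"], source["conv_w"])  # features transferred
assert target["fc_w"].shape == (10, 8)                     # fresh head for new task
```

The contrast studied in the paper is between this transferred initialization and `init_params` called directly on the target task (random initialization).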
Database: | OpenAIRE |
External link: |