Description: |
Deep learning, and Convolutional Neural Networks (CNNs) in particular, is reshaping the field of image recognition and classification. Many different CNN architectures are found in the literature, and the performance of any CNN model depends on various parameters such as dataset size, number of classes, model weights, hyperparameters, and the choice of optimizer. Transfer learning, i.e., fine-tuning a pretrained model, has become very common because of its advantages, and several model optimization techniques have been discussed in the literature; Stochastic Gradient Descent (SGD), Adam, and RMSProp are the optimizers most commonly used for CNN training. This paper focuses on the effect of these three optimizers on two well-known CNN models, ResNet50 and InceptionV3. Each optimizer is used to train the fine-tuned CNN models for 15 epochs on a cat-vs-dog dataset created by hand-picking hundreds of cat and dog images from the Kaggle cats-vs-dogs dataset. For this experiment, a learning rate of 0.001 is used, and categorical cross-entropy is used to compute the training and validation loss. The optimizers are compared by plotting training loss versus epochs and training accuracy versus epochs. Results show that the SGD optimizer outperforms the other two for ResNet50, for which a training accuracy of approximately 99% is observed with 500 training and 100 validation images.
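For reference, the three update rules compared in the abstract can be sketched in plain Python. This is a minimal, self-contained illustration of the SGD, RMSProp, and Adam parameter updates on a scalar toy objective (f(w) = w^2), not the paper's actual training code; the hyperparameter defaults (rho, beta1, beta2, eps) are common library defaults assumed here, and the learning rate 0.001 matches the abstract.

```python
import math

def sgd_step(w, g, lr=0.001):
    # Vanilla SGD: step directly against the gradient.
    return w - lr * g

def rmsprop_step(w, g, state, lr=0.001, rho=0.9, eps=1e-7):
    # RMSProp: scale the step by a running average of squared gradients.
    state["v"] = rho * state["v"] + (1 - rho) * g ** 2
    return w - lr * g / (math.sqrt(state["v"]) + eps)

def adam_step(w, g, state, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-7):
    # Adam: bias-corrected first- and second-moment estimates.
    state["t"] += 1
    state["m"] = beta1 * state["m"] + (1 - beta1) * g
    state["v"] = beta2 * state["v"] + (1 - beta2) * g ** 2
    m_hat = state["m"] / (1 - beta1 ** state["t"])
    v_hat = state["v"] / (1 - beta2 ** state["t"])
    return w - lr * m_hat / (math.sqrt(v_hat) + eps)

# Minimise f(w) = w^2 (gradient 2w) for 15 steps, mirroring the 15 epochs
# used in the experiment, with each optimizer starting from w = 1.0.
w_sgd = w_rms = w_adam = 1.0
rms_state = {"v": 0.0}
adam_state = {"m": 0.0, "v": 0.0, "t": 0}
for _ in range(15):
    w_sgd = sgd_step(w_sgd, 2 * w_sgd)
    w_rms = rmsprop_step(w_rms, 2 * w_rms, rms_state)
    w_adam = adam_step(w_adam, 2 * w_adam, adam_state)
print(w_sgd, w_rms, w_adam)
```

All three optimizers move the parameter toward the minimum at w = 0; they differ in how the raw gradient is rescaled, which is exactly what drives the convergence differences the paper measures.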