Vehicle Type Classification with Small Dataset and Transfer Learning Techniques

Author: Quang-Tu Pham, Dinh-Dat Pham, Khanh-Ly Can, Hieu Dao To, Hoang-Dieu Vu
Language: English
Year of publication: 2024
Subject:
Source: EAI Endorsed Transactions on Industrial Networks and Intelligent Systems, Vol 11, Iss 2 (2024)
Document type: article
ISSN: 2410-0218
DOI: 10.4108/eetinis.v11i2.4678
Description: This study investigates deep learning training techniques on a restricted dataset of roughly 400 vehicle images sourced from Kaggle. With so little data, training models from scratch is impractical, which motivates the use of pre-trained models with pre-trained weights instead. The investigation compares three prominent models (EfficientNetB0, ResNetB0, and MobileNetV2), with EfficientNetB0 emerging as the most proficient choice. Using a gradual layer-unfreezing schedule over a fixed number of epochs (sketched below), EfficientNetB0 achieves remarkable accuracy, reaching 99.5% on the training dataset and 97% on the validation dataset. In contrast, training models from scratch yields markedly lower accuracy. Here knowledge distillation proves pivotal (also sketched below), overcoming this limitation and raising accuracy from 29.5% in training and 20.5% in validation to 54% and 45%, respectively. The study's distinctive contributions are its exploration of transfer learning with gradual layer unfreezing and its demonstration of the potential of knowledge distillation, showing that both robustly enhance model performance under data scarcity and thereby address the challenges of training deep learning models on limited datasets. The findings underscore the practical value of these techniques for achieving strong results under real-world data constraints.
Database: Directory of Open Access Journals
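
The abstract above describes fine-tuning a pre-trained EfficientNetB0 with gradual layer unfreezing. The following is a minimal sketch of that technique, assuming TensorFlow/Keras as the framework; the unfreezing schedule, class count, and all hyperparameters are illustrative assumptions, not values from the paper.

```python
# Sketch: transfer learning with gradual layer unfreezing (assumptions:
# TensorFlow/Keras, 5 vehicle classes, schedule and learning rates invented).
import tensorflow as tf
from tensorflow.keras.applications import EfficientNetB0

NUM_CLASSES = 5          # assumed number of vehicle types
IMG_SIZE = (224, 224)    # EfficientNetB0's default input resolution

# Load ImageNet weights and drop the classification head.
base = EfficientNetB0(include_top=False, weights="imagenet",
                      input_shape=IMG_SIZE + (3,), pooling="avg")
base.trainable = False   # phase 1: train only the new head

model = tf.keras.Sequential([
    base,
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])

def compile_model(lr):
    model.compile(optimizer=tf.keras.optimizers.Adam(lr),
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])

def unfreeze_top(n_layers):
    """Unfreeze the last n_layers of the backbone, keeping
    BatchNormalization layers frozen to preserve their statistics."""
    base.trainable = True
    for layer in base.layers[:-n_layers]:
        layer.trainable = False
    for layer in base.layers[-n_layers:]:
        if isinstance(layer, tf.keras.layers.BatchNormalization):
            layer.trainable = False

# Illustrative schedule: train the head first, then expose deeper blocks
# with a smaller learning rate each phase.
# compile_model(1e-3); model.fit(train_ds, validation_data=val_ds, epochs=10)
# unfreeze_top(20); compile_model(1e-4); model.fit(train_ds, validation_data=val_ds, epochs=10)
# unfreeze_top(60); compile_model(1e-5); model.fit(train_ds, validation_data=val_ds, epochs=10)
```

The abstract also credits knowledge distillation with lifting a from-scratch model's accuracy. Below is a minimal sketch of standard knowledge distillation in the same assumed framework, with the fine-tuned network acting as teacher; the temperature, loss weighting, and training-step structure are assumptions for illustration, and both models are assumed to output raw logits.

```python
# Sketch: knowledge distillation (assumptions: Keras custom train_step,
# temperature 4.0 and 50/50 loss weighting chosen for illustration).
import tensorflow as tf

TEMPERATURE = 4.0   # softens teacher outputs (assumed value)
ALPHA = 0.5         # weight between hard-label and distillation loss

class Distiller(tf.keras.Model):
    def __init__(self, student, teacher):
        super().__init__()
        self.student, self.teacher = student, teacher
        self.teacher.trainable = False   # the teacher is frozen

    def compile(self, optimizer):
        super().compile(optimizer=optimizer)
        self.ce = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True)
        self.kl = tf.keras.losses.KLDivergence()

    def train_step(self, data):
        x, y = data
        t_logits = self.teacher(x, training=False)
        with tf.GradientTape() as tape:
            s_logits = self.student(x, training=True)
            # Hard loss: student against the ground-truth labels.
            hard = self.ce(y, s_logits)
            # Soft loss: student matches the teacher's softened distribution;
            # the T^2 factor keeps gradient magnitudes comparable.
            soft = self.kl(
                tf.nn.softmax(t_logits / TEMPERATURE),
                tf.nn.softmax(s_logits / TEMPERATURE),
            ) * TEMPERATURE ** 2
            loss = ALPHA * hard + (1 - ALPHA) * soft
        grads = tape.gradient(loss, self.student.trainable_variables)
        self.optimizer.apply_gradients(zip(grads, self.student.trainable_variables))
        return {"loss": loss}

# Usage sketch: distiller = Distiller(student_model, teacher_model)
# distiller.compile(tf.keras.optimizers.Adam(1e-3)); distiller.fit(train_ds)
```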