Comparison of deep learning models in terms of multiple object detection on satellite images
Author: Ferdi Doğan, Ibrahim Turkoğlu
Year of publication: 2021
Source: Journal of Engineering Research
ISSN: 2307-1885, 2307-1877
DOI: 10.36909/jer.12843
Description: Images obtained by remote sensing contain important data about the ground surface, and detecting objects on that surface in such images is an important problem. Deep learning models are known to give good results in object detection studies, but their relative strengths are not well established. For this reason, it should be clarified which model is superior for object detection and which model should be preferred in future studies. This study aimed to reveal those relative strengths by comparing the models' performance in detecting multiple objects. Eleven deep learning models frequently encountered in the literature were applied to detecting objects from 14 classes in the DOTA dataset. 49,053 objects in 888 images were used for training with the AlexNet, VGG16, VGG19, GoogLeNet, SqueezeNet, ResNet18, ResNet50, ResNet101, InceptionResNetV2, InceptionV3, and DenseNet201 models. After training, 13,772 objects from the same 14 classes in 277 images were used for testing with R-CNN, one of the standard object detection methods. The performance of each model on the 14 classes was measured using Average Precision (AP) and Mean Average Precision (mAP); standard definitions of both metrics are sketched after this record. Performance differences between the models were observed within individual classes, and the best-performing model varied from class to class. Overall, the highest mAP across the 14 classes was achieved by VGG16 with 24.64, while the lowest was InceptionResNetV2 with 11.78. This article demonstrates in practice how well deep learning models detect multiple objects and is intended as a resource for researchers working on this subject.
Database: OpenAIRE
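For reference, the AP and mAP metrics named in the description have standard definitions; the formulation below is the usual one, and the record does not state which interpolation scheme the paper adopts:

\[
\mathrm{AP}_c = \int_0^1 p_c(r)\,\mathrm{d}r, \qquad \mathrm{mAP} = \frac{1}{C}\sum_{c=1}^{C}\mathrm{AP}_c
\]

Here \(p_c(r)\) is the precision of class \(c\) as a function of recall, and \(C = 14\) is the number of classes, so the reported mAP of 24.64 for VGG16 is the unweighted mean of its 14 per-class AP values.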
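The sketch below shows how AP and mAP can be computed from confidence-ranked detections. It is a minimal illustration under common conventions (all-point interpolation), not the paper's evaluation code; all names (`average_precision`, `n_ground_truth`, the toy inputs) are illustrative.

```python
import numpy as np

def average_precision(scores, is_tp, n_ground_truth):
    """AP for one class: area under the precision-recall curve.

    scores: detection confidences; is_tp: 1 if the detection matches a
    ground-truth box, else 0; n_ground_truth: number of true objects.
    """
    order = np.argsort(-np.asarray(scores, dtype=float))  # rank by confidence, descending
    tp = np.asarray(is_tp, dtype=float)[order]
    cum_tp = np.cumsum(tp)
    cum_fp = np.cumsum(1.0 - tp)
    recall = cum_tp / n_ground_truth
    precision = cum_tp / (cum_tp + cum_fp)
    # Make precision monotonically non-increasing from the right,
    # then integrate it over recall (all-point interpolation).
    precision = np.maximum.accumulate(precision[::-1])[::-1]
    recall = np.concatenate(([0.0], recall))
    precision = np.concatenate(([precision[0] if precision.size else 0.0], precision))
    return float(np.sum(np.diff(recall) * precision[1:]))

def mean_average_precision(per_class_ap):
    """mAP: unweighted mean of the per-class AP values."""
    return float(np.mean(per_class_ap))

# Toy usage: one class with 3 detections and 2 ground-truth objects.
ap = average_precision(scores=[0.9, 0.8, 0.3], is_tp=[1, 0, 1], n_ground_truth=2)
print(ap)                            # 0.8333...
print(mean_average_precision([ap]))  # mean over (here) a single class
```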