Comparison of Single-Shot and Two-Shot Deep Neural Network Models for Whitefly Detection in IoT Web Application

Author: Chinmay U. Parab, Canicius Mwitta, Miller Hayes, Jason M. Schmidt, David Riley, Kadeghe Fue, Suchendra Bhandarkar, Glen C. Rains
Language: English
Year of publication: 2022
Source: AgriEngineering, Vol. 4, Iss. 2, pp. 507-522 (2022)
Document type: article
ISSN: 2624-7402
DOI: 10.3390/agriengineering4020034
Description: In this study, we compared YOLOv4, a single-shot detector, with Faster-RCNN, a two-shot detector, for detecting and classifying whiteflies on yellow-sticky tape (YST). An IoT remote whitefly monitoring station was developed and placed in a whitefly rearing room. Images of whiteflies attracted to the trap were recorded twice per day. A total of 120 whitefly images were labeled using labeling software and split into training and testing datasets. An additional 18 yellow-sticky tape images containing false positives were labeled to improve model accuracy, because remote whitefly monitors in the field produced false positives from water beads and reflected light on the tape after rain. A two-shot detection model has two stages: region proposal, followed by classification of the proposed regions and refinement of the location prediction. Single-shot detection skips the region proposal stage and yields the final localization and content prediction in a single pass. Because of this difference, YOLOv4 is faster but less accurate than Faster-RCNN. The results of our study show that Faster-RCNN (precision 95.08%, F1 score 0.96, recall 98.69%) achieved a higher level of performance than YOLOv4 (precision 71.77%, F1 score 0.83, recall 73.31%), and it will be adopted for further development of the monitoring station.
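
As a reading aid for the reported metrics, the following is a minimal Python sketch (not taken from the paper) of how precision, recall, and the F1 score are typically computed from detections matched against labeled whiteflies; the true-positive, false-positive, and false-negative counts in the example are hypothetical placeholders, since the paper's raw counts are not given in this record.

# Minimal illustrative sketch; the counts below are hypothetical, not from the study.
def detection_metrics(tp: int, fp: int, fn: int):
    """Return (precision, recall, F1) from detection counts."""
    precision = tp / (tp + fp)   # fraction of predicted whiteflies that are real
    recall = tp / (tp + fn)      # fraction of real whiteflies that were detected
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

# Example with made-up counts of detections matched to ground-truth labels.
p, r, f1 = detection_metrics(tp=450, fp=23, fn=6)
print(f"precision={p:.2%}, recall={r:.2%}, F1={f1:.2f}")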
Database: Directory of Open Access Journals