Description: |
The bagworm Metisa plana is one of the major leaf-eating insect pests that attack oil palm in Peninsular Malaysia. Left untreated, a moderate attack can cause a 43% yield loss. In 2020, the economic loss due to bagworm attacks was recorded at around RM 180 million. It is therefore necessary to closely monitor bagworm outbreaks in infested areas. However, the accuracy and precision of manually collected census data are questionable because of human error. Hence, the objective of this study is to design and develop a dedicated machine vision device that incorporates an image processing algorithm matched to its functional modes. The resulting device, the Automated Bagworm Counter or Oto-BaCTM, is the first in the world to be developed with embedded software based on graphics processing unit (GPU) computation and a TensorFlow/Theano library setup for the trained dataset. The technology is based on deep learning with the Faster Region-based Convolutional Neural Network (Faster R-CNN) technique for real-time object detection. The Oto-BaCTM uses an ordinary camera. Using self-developed deep learning algorithms, motion tracking and false-colour analysis are applied to detect and count the living and dead larvae and pupae per frond, classified into three major size groups. In the first field trial, the Oto-BaCTM achieved low detection accuracies for the living and dead G1 larvae (47.0% and 71.7%), G2 larvae (39.1% and 50.0%) and G3 pupae (30.1% and 20.9%). After improvements to the training dataset, the accuracies increased in the next field trial by 40.5% and 7.0% for the living and dead G1 larvae, by 40.1% and 29.2% for the living and dead G2 larvae, and by 47.7% and 54.6% for the living and dead pupae. This ground-based device is a pioneering development in the oil palm industry, reducing human error during censuses while promoting precision agriculture practice. |
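For illustration only, the minimal sketch below shows how per-frond counts of living and dead individuals in the three size groups could be tallied from the outputs of a Faster R-CNN detector. It assumes a model exported with the TensorFlow Object Detection API; the model path, class-ID mapping and score threshold are hypothetical and are not taken from the Oto-BaCTM software.

    # Hypothetical sketch: tallying live/dead bagworm detections per frond image
    # with a Faster R-CNN detector exported via the TensorFlow Object Detection API.
    # Model path, class IDs and score threshold below are illustrative assumptions.
    from collections import Counter

    import numpy as np
    import tensorflow as tf

    # Assumed class-ID mapping: one class per (status, size-group) combination.
    CLASS_NAMES = {
        1: "live_G1", 2: "dead_G1",
        3: "live_G2", 4: "dead_G2",
        5: "live_G3_pupa", 6: "dead_G3_pupa",
    }
    SCORE_THRESHOLD = 0.5  # assumed confidence cut-off

    def count_bagworms(model, frame_rgb: np.ndarray) -> Counter:
        """Run the detector on one frond image and tally detections per class."""
        # Object Detection API SavedModels expect a uint8 batch of shape [1, H, W, 3].
        inputs = tf.convert_to_tensor(frame_rgb[np.newaxis, ...], dtype=tf.uint8)
        outputs = model(inputs)
        scores = outputs["detection_scores"][0].numpy()
        classes = outputs["detection_classes"][0].numpy().astype(int)
        counts = Counter()
        for cls, score in zip(classes, scores):
            if score >= SCORE_THRESHOLD and cls in CLASS_NAMES:
                counts[CLASS_NAMES[cls]] += 1
        return counts

    if __name__ == "__main__":
        detector = tf.saved_model.load("bagworm_faster_rcnn/saved_model")  # hypothetical path
        frame = np.zeros((480, 640, 3), dtype=np.uint8)  # placeholder for a camera frame
        print(count_bagworms(detector, frame))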