Central Object Segmentation by Deep Learning to Continuously Monitor Fruit Growth through RGB Images
Author: | Shinya Yuki, Takashi Okuno, Motohisa Fukuda |
Language: | English |
Year of publication: | 2021 |
Subject: | Computer science; TP1-1185 (Chemical technology); Biochemistry; Analytical Chemistry; Computer vision; Electrical and Electronic Engineering; Instrumentation; image segmentation; pixel; artificial neural network; deep learning; pear; fruit; U-Net; growth monitor; Atomic and Molecular Physics and Optics; RGB color model; RGB images; artificial intelligence; Neural Networks, Computer; central object |
Source: | Sensors (Basel, Switzerland), Vol. 21, Iss. 21, Article 6999 (2021) |
ISSN: | 1424-8220 |
DOI: | 10.3390/s21216999 |
Description: | Monitoring fruit growth is useful for estimating final yields in advance and predicting optimum harvest times. However, observing fruit all day at the farm via RGB images is not an easy task, because the light conditions are constantly changing. In this paper, we present CROP (Central Roundish Object Painter). The method performs image segmentation by deep learning; the architecture of the neural network is a deeper version of U-Net. CROP identifies different types of central roundish fruit in an RGB image under varied light conditions and creates a corresponding mask. Counting the mask pixels gives the relative two-dimensional size of the fruit, so time-series images may provide a non-contact means of automatically monitoring fruit growth. Although our measurement unit differs from the traditional one (length), we believe that shape identification potentially provides more information. Interestingly, CROP has more general uses, working even for some other roundish objects. For this reason, we hope that CROP and our methodology yield big data that promote scientific advancements in horticultural science and other fields. |
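The pixel-counting step described in the abstract can be sketched as follows. This is a minimal illustration, not code from the paper: it assumes the segmentation mask is available as a binary NumPy array, and the function name is hypothetical.

```python
import numpy as np

def relative_fruit_size(mask: np.ndarray) -> int:
    """Relative 2-D fruit size as the number of foreground
    (fruit) pixels in a binary segmentation mask, per the
    pixel-counting idea in the abstract."""
    return int(np.count_nonzero(mask))

# Toy example: a 5x5 mask with a 3x3 "fruit" region.
mask = np.zeros((5, 5), dtype=np.uint8)
mask[1:4, 1:4] = 1
print(relative_fruit_size(mask))  # 9 foreground pixels
```

Applied to a time series of masks from the same camera position, these counts would trace relative growth without contacting the fruit.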
Database: | OpenAIRE |
External link: |