Description: |
The value of satellite imagery for monitoring crop health in near-real time continues to grow as new missions make data available at ever higher spatial and temporal resolution. Yet cloud cover remains an obstacle to vegetation indices (VIs) derived solely from optical imagery, especially in certain regions and climates. Previous research has demonstrated that VIs such as the Normalized Difference Vegetation Index (NDVI) and related biophysical variables such as the Leaf Area Index (LAI) can be reconstructed from synthetic aperture radar (SAR) data, which is unaffected by cloud cover. Publicly available data from SAR missions such as Sentinel-1, at reasonably high spatial resolution, offer agricultural users an affordable way to integrate satellite imagery into their day-to-day operations. Previous studies have successfully reconstructed optical VIs (e.g. from Sentinel-2) from SAR data (e.g. from Sentinel-1) using various machine learning approaches, but only for a limited number of crop types, and these efforts typically train on individual pixels rather than exploiting information at the field level. Here we present Beyond Cloud, the first product to combine computer vision and machine learning approaches to deliver fused optical- and SAR-based crop health information. Field-level learning is especially well suited to inherently noisy SAR data. Several use cases are presented for agricultural fields across the United Kingdom, France and Belgium, where cloud cover can limit optical-only solutions to as few as 2-3 usable images per growing season. Preliminary work on additional product features, including automated crop and soil type detection, is also discussed. Beyond Cloud is accessible via a simple API, making its results easy to integrate into existing dashboards and smart-ag tools. Overall, these efforts promote the accessibility of satellite imagery for real agricultural end users.
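For context on the reconstruction target named in the abstract: NDVI is the normalized difference of near-infrared and red reflectance (for Sentinel-2, bands B8 and B4). A minimal sketch follows; the function name and the choice to map zero denominators to 0.0 are our own illustrative assumptions, not part of the Beyond Cloud product.

```python
import numpy as np

def ndvi(nir, red):
    """NDVI = (NIR - Red) / (NIR + Red), ranging over [-1, 1].

    nir, red: reflectance values or arrays (e.g. Sentinel-2 B8 and B4).
    Zero denominators (no signal) are mapped to 0.0 instead of NaN --
    an illustrative choice, not prescribed by the source.
    """
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    denom = nir + red
    # Guard the division: substitute 1 where denom == 0, then mask to 0.0.
    return np.where(denom == 0, 0.0,
                    (nir - red) / np.where(denom == 0, 1.0, denom))

# Healthy vegetation reflects strongly in NIR, giving values near 1;
# bare soil and water give values near or below 0.
```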