The ability to monitor crop health, condition, and productivity is critical when combating agricultural shortages and food insecurity. An illustrative example was the global food price spikes that followed the 2008 financial crisis, a situation severe enough that the G20 convened and created an organization called GEOGLAM to help monitor for and prevent future disasters.
Monitoring agricultural land globally, however, is a labor- and resource-intensive task, requiring the ability both to observe the entire globe and to analyze the observations. Agricultural researchers have often turned to remote sensing to meet these demands, since land-observing satellites can monitor the entire globe with relative ease. NASA Harvest, for example, has used remote sensing methodologies to predict crop growth stages, monitor the impact of droughts, and quantify agricultural market uncertainties.
Current crop monitoring methods face challenges, though, chief among them persistent cloud cover (especially in tropical regions) and the inability of optical imagery to see through clouds. Optical imagery captures light in the portion of the electromagnetic spectrum visible to human eyes and has a long history of use in land observation research. Unfortunately, these wavelengths cannot pass through clouds, limiting optical imagery's usefulness in persistently cloudy environments.
Fortunately, there are alternatives to optical imagery that can penetrate clouds, and one readily available option is radar. Synthetic aperture radar (SAR) imagery thus offers a way to monitor fields otherwise hidden from optical view. By integrating SAR with traditional optical methods, researchers can pierce the cloudy veil and more effectively analyze ground conditions.
Graphical Abstract from Hosseini et al. 2020
NASA Harvest partner Mehdi Hosseini, alongside Heather McNairn [Agriculture and Agri-Food Canada], Scott Mitchell [Carleton University], Laura Dingle Robertson [Agriculture and Agri-Food Canada], Andrew Davidson [Agriculture and Agri-Food Canada], and Saeid Homayouni [Centre Eau Terre Environnement], recently developed a methodology for integrating SAR and optical data that lets users monitor corn development with biomass products derived from SAR imagery. Using a transfer function built with a neural network, the team related the amount of corn biomass estimated from SAR imagery to the amount estimated from optical imagery. With this relationship established, researchers can now use SAR imagery to observe corn development through cloud cover and compare those results to ones obtained with more traditional optical methods. This allows for more frequent observations, widening the data available for analysis.
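The idea of a transfer function can be illustrated with a minimal sketch: fit a small neural network that maps SAR-derived biomass estimates onto the optical-derived biomass scale. Everything below is a hypothetical illustration, not the authors' code; the data are synthetic, and the actual inputs, network architecture, and training procedure are described in Hosseini et al. (MethodsX).

```python
# Hypothetical sketch (not the authors' implementation): learn a neural-network
# "transfer function" from SAR-based corn biomass estimates to optical-based
# estimates. All data here are synthetic stand-ins.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)

# Synthetic paired samples: SAR-based biomass vs. the optical-based estimate
# for the same field and date (kg/m^2). The mild nonlinearity and noise are
# invented purely to give the network something to learn.
sar = rng.uniform(0.05, 1.5, size=(500, 1))
optical = (0.9 * sar.ravel()
           + 0.05 * np.sin(sar.ravel() * 4.0)
           + rng.normal(0.0, 0.02, size=500))

X_train, X_test, y_train, y_test = train_test_split(
    sar, optical, test_size=0.25, random_state=0)

# A small multilayer perceptron stands in for the paper's transfer function;
# inputs are standardized so the optimizer converges reliably.
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000, random_state=0))
model.fit(X_train, y_train)

r2 = model.score(X_test, y_test)
print(f"Held-out R^2 of SAR-to-optical transfer function: {r2:.3f}")
```

Once such a mapping is learned, SAR-based biomass estimates taken on cloudy days can be placed on the same scale as the optical record, filling gaps in the optical time series.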
The team’s results were published as “Integration of synthetic aperture radar and optical satellite data for corn biomass estimation” in the journal MethodsX and can be viewed here.