Land surface temperature (LST) plays a fundamental role in various geophysical processes across spatial and temporal scales. Satellite-based observations of LST provide a viable option for monitoring the spatiotemporal evolution of these processes. Downscaling is a widely adopted approach for resolving the spatial-temporal trade-off inherent in satellite-based LST observations. However, despite advances in LST downscaling, biases introduced by spatial averaging within the downscaling methodology greatly hamper the utility of coarse-resolution thermal data in complex environments. In this study, an improved LST downscaling approach based on random forest (RF) regression is presented. The proposed approach addresses the spatial averaging biases associated with a downscaling model developed at the coarse resolution. The approach was applied to downscale the coarse-resolution Satellite Application Facility on Land Surface Analysis (LSA-SAF) LST product derived from the Spinning Enhanced Visible and Infrared Imager (SEVIRI) sensor aboard the Meteosat Second Generation (MSG) weather satellite. The LSA-SAF product was downscaled to a spatial resolution of ~30 m using predictor variables derived from Sentinel 2 and the Advanced Land Observing Satellite (ALOS) digital elevation model (DEM). Both quantitatively and qualitatively, the proposed approach produced better downscaling results than the conventional RF-based approach widely adopted in previous studies. The enhanced performance indicates that the proposed approach can reduce the spatial averaging biases inherent in the LST downscaling methodology and is thus more suitable for downscaling applications in complex environments.
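The conventional RF-based workflow that the abstract uses as its baseline can be illustrated with a minimal sketch: aggregate fine-resolution predictors to the coarse LST grid, fit the RF at coarse resolution, apply it at fine resolution, and redistribute the coarse-scale residuals. The sketch below uses synthetic stand-ins for the Sentinel 2 and ALOS DEM predictors and a hypothetical 10:1 scale ratio; it is not the authors' implementation and does not include their proposed bias correction.

```python
# Illustrative sketch of conventional RF-based LST downscaling
# (synthetic data; NOT the paper's implementation or its improved approach).
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
scale = 10  # hypothetical ratio: one coarse pixel spans 10x10 fine pixels

# Synthetic fine-resolution predictors (stand-ins for Sentinel 2 NDVI / ALOS DEM)
ndvi_fine = rng.uniform(0.0, 0.9, size=(100, 100))
elev_fine = rng.uniform(0.0, 1500.0, size=(100, 100))

def aggregate(arr, s):
    """Block-average a fine grid onto the coarse grid (spatial averaging)."""
    h, w = arr.shape
    return arr.reshape(h // s, s, w // s, s).mean(axis=(1, 3))

# Synthetic "observed" coarse LST with a simple dependence on the predictors
lst_fine_true = 320.0 - 25.0 * ndvi_fine - 0.006 * elev_fine
lst_coarse = aggregate(lst_fine_true, scale)

# 1) Fit the RF regression at coarse resolution
X_coarse = np.column_stack([aggregate(ndvi_fine, scale).ravel(),
                            aggregate(elev_fine, scale).ravel()])
rf = RandomForestRegressor(n_estimators=200, random_state=0)
rf.fit(X_coarse, lst_coarse.ravel())

# 2) Apply the coarse-resolution model to the fine-resolution predictors
X_fine = np.column_stack([ndvi_fine.ravel(), elev_fine.ravel()])
lst_fine_pred = rf.predict(X_fine).reshape(ndvi_fine.shape)

# 3) Residual correction: spread each coarse residual over its fine pixels
residual_coarse = lst_coarse - aggregate(lst_fine_pred, scale)
lst_downscaled = lst_fine_pred + np.kron(residual_coarse,
                                         np.ones((scale, scale)))

# By construction the downscaled field re-aggregates to the coarse LST
print(np.allclose(aggregate(lst_downscaled, scale), lst_coarse))
```

Training the model at coarse resolution and applying it at fine resolution is precisely where the spatial averaging assumption enters: the RF learns a relationship between block-averaged predictors and LST, which need not hold pixel-by-pixel in heterogeneous terrain — the limitation the proposed approach targets.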
Number of pages: 23
Publication status: Published - 25 Oct 2020
- Sentinel 2
- Random forest
- Spatial averaging biases