Mapping plant traits in agricultural fields based on time series of hyperspectral images from Unmanned Aerial Vehicles (UAV)

G. Ntakos, C. van der Tol (Contributor), Tamme van der Wal (Contributor)

Research output: Contribution to conference › Other › Academic

Abstract

Hyperspectral cameras mounted on unmanned aerial vehicles (UAVs) provide information with high spatial, temporal and spectral resolution. Such detailed information allows us to monitor various plant traits, which is essential for precision agriculture applications, and marks a shift for remote sensing, which until recently focused on multispectral imagery. In particular, the higher spectral resolution has the potential to reveal details of plant health status that were not distinguishable before. The added value of hyperspectral remote sensing may contribute to exploring the spatial and temporal development of various plant traits in the field. This can help farmers redefine management zones and agricultural practices throughout the cultivation period.

During the summer of 2018, a time series of images of a potato field in the Netherlands was acquired using a fixed-wing drone and a hyperspectral camera made by the Swiss company Gamaya, able to measure reflectance in 40 bands in the range from 450 nm to 900 nm. The objective of the study was to create maps of key plant traits by using the lookup table (LUT) approach and a radiative transfer model (RTM).

In order to narrow the range of variables for the LUT simulations, we first estimated the possible ranges of the vegetation parameters from 50 randomly selected pixels by using model inversion. By using the RTM SCOPE, we obtained the minima and maxima of the parameters of the pixels from the study area. We thereafter generated a LUT of approximately 100,000 simulations for each image, with the values of the input parameters within these minima and maxima, using a normal distribution sampling approach. Finally, we compared the reflectance measurements from the images to the LUT and found the best LUT simulation for each measurement according to root mean square error (RMSE). In this way, the key vegetation parameters, such as chlorophyll a+b concentration (Cab), carotenoid content (Cca), dry matter content (Cdm) and leaf area index (LAI), were estimated for the images.
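The LUT workflow described above can be sketched as follows. This is a minimal illustration only: the toy forward model stands in for SCOPE (which is not shown), and the parameter ranges, sample sizes and function names are placeholders, not the study's actual values.

```python
import numpy as np

rng = np.random.default_rng(42)
N_BANDS = 40    # the Gamaya camera measures 40 bands, 450-900 nm
N_SIMS = 1000   # the study used ~100,000 simulations; kept small here

# Parameter ranges (min, max) as would be estimated from the 50-pixel
# inversion step. Values are illustrative placeholders.
param_ranges = {"Cab": (20.0, 80.0),    # chlorophyll a+b
                "Cca": (5.0, 20.0),     # carotenoids
                "Cdm": (0.002, 0.02),   # dry matter
                "LAI": (0.5, 6.0)}      # leaf area index

def sample_params(ranges, n):
    """Normal sampling centred on the range midpoint, clipped to [min, max]."""
    out = {}
    for name, (lo, hi) in ranges.items():
        mu, sigma = 0.5 * (lo + hi), (hi - lo) / 4.0
        out[name] = np.clip(rng.normal(mu, sigma, n), lo, hi)
    return out

def toy_rtm(params, n_bands=N_BANDS):
    """Stand-in for SCOPE: maps parameter sets to TOC reflectance spectra.
    The spectral response below is purely illustrative."""
    wl = np.linspace(0.0, 1.0, n_bands)
    cab = params["Cab"][:, None] / 80.0
    lai = params["LAI"][:, None] / 6.0
    return 0.05 + 0.5 * lai * (1 - np.exp(-3 * wl[None, :])) * (1 - 0.4 * cab)

# Build the LUT: sampled parameter sets and their simulated spectra
lut_params = sample_params(param_ranges, N_SIMS)
lut_spectra = toy_rtm(lut_params)                  # shape (N_SIMS, N_BANDS)

def retrieve(measured):
    """Return the parameters of the LUT entry with the lowest RMSE
    against a measured reflectance spectrum."""
    rmse = np.sqrt(np.mean((lut_spectra - measured[None, :]) ** 2, axis=1))
    best = int(np.argmin(rmse))
    return {k: float(v[best]) for k, v in lut_params.items()}, float(rmse[best])

# Retrieve traits for one synthetic "measured" pixel
pixel = toy_rtm({k: np.array([0.5 * (lo + hi)])
                 for k, (lo, hi) in param_ranges.items()})[0]
traits, err = retrieve(pixel)
```

Because the per-pixel cost is a single vectorised RMSE computation against a precomputed table, this scales to whole images far more cheaply than running an optimiser per pixel, which is the efficiency argument made below.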

The results show that the size of the LUT is a key factor in achieving a small difference between the measured and simulated TOC reflectance. However, estimating plant trait values by assigning LUT simulations is a significantly less time-consuming approach than running a numerical optimization for each individual image pixel. In our case study, we achieved an average RMSE of 0.13 with a standard deviation of 0.05 by selecting the “best” simulation for each pixel; as future work, it could be interesting to investigate whether assigning a larger number of simulations to each image pixel can further improve our results. Furthermore, when using RTMs, it is important to always take into consideration the potential ill-posedness of the retrieval, where different combinations of the model’s input parameters can produce the same TOC reflectance. This problem can be amplified by measurement errors, such as those caused by fast-changing lighting conditions at the time of data acquisition.
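The future-work idea of assigning more than one simulation per pixel amounts to a top-k variant of the retrieval: rather than taking the single best LUT entry, average the parameters of the k best-fitting entries, which can also damp the effect of ill-posed, near-identical spectra. A minimal sketch under assumed toy data (the function name and arrays are hypothetical, not from the study):

```python
import numpy as np

def retrieve_topk(measured, lut_spectra, lut_params, k=10):
    """Average the parameters of the k best-fitting LUT entries
    (by RMSE) instead of taking only the single best one."""
    rmse = np.sqrt(np.mean((lut_spectra - measured[None, :]) ** 2, axis=1))
    idx = np.argsort(rmse)[:k]
    return {name: float(vals[idx].mean()) for name, vals in lut_params.items()}

# Toy LUT: 5 entries, 3 spectral bands
lut_spectra = np.array([[0.1, 0.2, 0.3],
                        [0.1, 0.2, 0.4],
                        [0.5, 0.5, 0.5],
                        [0.1, 0.4, 0.3],
                        [0.9, 0.9, 0.9]])
lut_params = {"LAI": np.array([1.0, 2.0, 3.0, 4.0, 5.0])}
measured = np.array([0.1, 0.2, 0.3])

# The two closest entries are rows 0 and 1, so the averaged LAI is 1.5
result = retrieve_topk(measured, lut_spectra, lut_params, k=2)
```

With k=1 this reduces to the single-best-match retrieval used in the study.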

* The project has received funding from the European Union’s Horizon 2020 research and innovation program under the Marie Sklodowska-Curie grant agreement No 721995.
Original language: English
Number of pages: 1
Publication status: Published - 24 Sep 2019
Event: 10th Workshop on Hyperspectral Imaging and Signal Processing, WHISPERS 2019: Evolution in Remote Sensing - Beurs van Berlage, Amsterdam, Netherlands
Duration: 24 Sep 2019 - 26 Sep 2019

Conference

Conference: 10th Workshop on Hyperspectral Imaging and Signal Processing, WHISPERS 2019
Abbreviated title: WHISPERS 2019
Country: Netherlands
City: Amsterdam
Period: 24/09/19 - 26/09/19

Keywords

  • Remote Sensing
  • Hyperspectral
  • Agriculture
  • UAV images
  • Plant traits
  • SCOPE model
  • Radiative transfer modelling (RTM)