Estimating conifer forest LAI with HyMap data using a reflectance model and artificial neural nets

M. Schlerf, C.G. Atzberger, S. Mader, T. Udelhoven

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review



The potential of canopy reflectance modelling to retrieve structural variables in managed Norway spruce stands was investigated using the invertible forest reflectance model INFORM. INFORM was derived by coupling the FLIM, SAIL and LIBERTY models, and was inverted against hyperspectral airborne HyMap data using a neural network approach. A relatively simple three-layer feed-forward backpropagation neural network with two input neurons, one neuron in the hidden layer and three output neurons was employed. Leaf area index (LAI) field measurements from 39 forest stands were used to validate the LAI estimates produced from HyMap reflectances. Using two HyMap wavebands at 837 nm and 1148 nm, the LAI estimates achieved an RMSE of 0.58 (relative RMSE: 18% of the mean). In contrast to approaches based on empirical relations between a spectral vegetation index and the biophysical variable of interest, the inversion approach is applicable to various sensor types and site conditions.
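The network architecture described above (two reflectance inputs, one hidden neuron, three structural outputs) can be sketched as a plain backpropagation loop. This is a minimal illustration only: the training data here are random placeholders, whereas the paper trained on INFORM-simulated reflectances; the learning rate, epoch count and sigmoid activations are assumptions, not details from the source.

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder training set (hypothetical): reflectance at 837 nm and
# 1148 nm as inputs, three structural variables (e.g. LAI) as outputs,
# all scaled to [0, 1]. The paper used INFORM-simulated spectra instead.
X = rng.random((200, 2))
Y = rng.random((200, 3))

# Weights for a 2 -> 1 -> 3 feed-forward network.
W1 = rng.normal(scale=0.5, size=(2, 1)); b1 = np.zeros(1)
W2 = rng.normal(scale=0.5, size=(1, 3)); b2 = np.zeros(3)

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
lr = 0.5  # assumed learning rate

for epoch in range(2000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)       # hidden activation, shape (n, 1)
    out = sigmoid(h @ W2 + b2)     # network outputs, shape (n, 3)

    # Backpropagation of mean-squared-error gradients through
    # the sigmoid activations.
    d_out = (out - Y) * out * (1 - out)   # output-layer delta
    d_h = (d_out @ W2.T) * h * (1 - h)    # hidden-layer delta

    W2 -= lr * h.T @ d_out / len(X); b2 -= lr * d_out.mean(axis=0)
    W1 -= lr * X.T @ d_h / len(X);   b1 -= lr * d_h.mean(axis=0)

rmse = np.sqrt(((out - Y) ** 2).mean())
print(f"training RMSE: {rmse:.3f}")
```

In the inversion setting, such a network would be trained on model-simulated waveband reflectances and then applied to the observed HyMap pixels to predict the structural variables; the RMSE of 0.58 reported in the abstract refers to validation against the 39 field-measured stands, not to a training error like the one printed here.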
Original language: English
Title of host publication: Proceedings of 4th EARSeL Workshop on Imaging Spectroscopy
Subtitle of host publication: New quality in environmental studies, Warsaw, Poland, 26-29 April, 2005
Editors: B. Zagajewski, M. Sobczak
Place of publication: Warsaw, Poland
Publisher: European Association of Remote Sensing Laboratories (EARSeL) and Warsaw University
Publication status: Published - 2005
Event: 4th EARSeL Workshop on Imaging Spectroscopy 2005: New quality in environmental studies - Warsaw, Poland
Duration: 27 Apr 2005 - 30 Apr 2005
Conference number: 4


Workshop: 4th EARSeL Workshop on Imaging Spectroscopy 2005


  • NRS
  • ADLIB-ART-1292


