An Approach to Tree Detection Based on the Fusion of Multitemporal LiDAR Data

Daniele Marinelli, Claudia Paris, Lorenzo Bruzzone*

*Corresponding author for this work

Research output: Contribution to journal › Article › Academic › peer-review

9 Citations (Scopus)


The repetitive acquisition of airborne light detection and ranging (LiDAR) data for forest surveys is rapidly increasing, thus making the analysis of forest dynamics possible. Moreover, the availability of multitemporal data makes it possible to improve the forest attribute estimates performed at a single date, especially when one LiDAR acquisition has a lower pulse density than the other. This letter presents a novel approach that exploits bitemporal data to: 1) improve tree detection at both dates and 2) identify forest changes at the single-tree level. This is achieved by means of a novel compound approach to the detection of trees in bitemporal data based on the Bayes rule for minimum error. Significant geometric features are extracted for each candidate tree-top and used to estimate the statistical terms employed in the compound approach. The multitemporal information is incorporated by iteratively estimating the transition probabilities, which take into account the temporal dependence between the LiDAR acquisitions. The proposed approach is evaluated on multitemporal LiDAR data acquired in a coniferous forest located in the Southern Italian Alps. Experimental results confirm the effectiveness of the compound detection, which increases the overall accuracy (OA) by up to 8.6% with respect to single-date detection.
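The compound decision rule sketched in the abstract can be illustrated with a minimal example. This is not the authors' implementation: all names, the likelihood values, and the transition matrix below are hypothetical; the sketch only shows the general form of a compound Bayes decision for a candidate tree-top observed at two dates, where the joint posterior combines single-date likelihoods with a transition probability modeling the temporal dependence.

```python
# Hypothetical sketch of a compound Bayes decision for bitemporal tree detection.
# Classes at each date: 0 = "no tree", 1 = "tree".
# p_x1, p_x2: class-conditional likelihoods p(x_t | c_t) of the geometric
# features of a candidate tree-top at dates 1 and 2 (illustrative values).
# prior: P(c1); transition: P(c2 | c1), the temporal-dependence term.

def compound_decision(p_x1, p_x2, prior, transition):
    """Return the (c1, c2) pair maximizing the joint posterior
    P(c1, c2 | x1, x2) ∝ p(x1|c1) * p(x2|c2) * P(c1) * P(c2|c1),
    assuming class-conditional independence of the two observations."""
    best_pair, best_score = None, -1.0
    for c1 in (0, 1):
        for c2 in (0, 1):
            score = p_x1[c1] * p_x2[c2] * prior[c1] * transition[c1][c2]
            if score > best_score:
                best_pair, best_score = (c1, c2), score
    return best_pair

# Example: date-1 evidence is ambiguous, date-2 evidence strongly favors
# "tree"; the transition matrix penalizes spurious class changes, so the
# temporal dependence helps resolve the ambiguous date-1 detection.
p_x1 = [0.45, 0.55]        # p(x1 | c1=0), p(x1 | c1=1)
p_x2 = [0.10, 0.90]        # p(x2 | c2=0), p(x2 | c2=1)
prior = [0.5, 0.5]
transition = [[0.9, 0.1],  # P(c2 | c1=0)
              [0.1, 0.9]]  # P(c2 | c1=1)
print(compound_decision(p_x1, p_x2, prior, transition))  # → (1, 1)
```

In the letter's approach the transition probabilities are not fixed as here but are re-estimated iteratively from the data; the sketch only fixes them to show how the joint decision couples the two dates.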
Original language: English
Article number: 8698891
Pages (from-to): 1771-1775
Number of pages: 5
Journal: IEEE Geoscience and Remote Sensing Letters
Issue number: 11
Early online date: 25 Apr 2019
Publication status: Published - Nov 2019
Externally published: Yes


  • ITC-CV


