Multi-source hierarchical conditional random field model for feature fusion of remote sensing images and LiDAR data

Z. Zhang, M. Y. Yang, M. Zhou

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review

1 Citation (Scopus)


Remote sensing images and LiDAR point cloud data are strongly complementary; fusing features from both sources can exploit the advantages of multiple feature classes and provide more reliable information for remote sensing applications such as object classification and recognition. In this paper, we introduce a novel multi-source hierarchical conditional random field (MSHCRF) model to fuse features extracted from remote sensing images and LiDAR data for image classification. First, typical features are selected to obtain regions of interest from the multi-source data; the MSHCRF model is then constructed over these regions to exploit the features, the category compatibility within images, and the category consistency across the multi-source data, and the output of the model represents the optimal image classification result. Competitive results demonstrate the precision and robustness of the proposed method.
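The abstract describes fusing image and LiDAR features inside a conditional random field. As a minimal illustrative sketch (not the authors' MSHCRF implementation): per-pixel class costs from the two sources can be combined into a fused unary term, and a Potts pairwise term can encode label smoothness; a simple inference scheme such as iterated conditional modes (ICM) then produces a labeling. All weights, shapes, and function names below are assumptions for illustration only.

```python
import numpy as np

def fused_unary(img_costs, lidar_costs, w_img=0.6, w_lidar=0.4):
    """Weighted combination of per-pixel class costs from two sources.

    img_costs, lidar_costs: (H, W, K) arrays of class costs.
    The weights are illustrative, not values from the paper.
    """
    return w_img * img_costs + w_lidar * lidar_costs

def crf_energy(labels, unary, beta=1.0):
    """Total energy: unary cost plus a Potts penalty on 4-neighbour edges."""
    h, w = labels.shape
    e = unary[np.arange(h)[:, None], np.arange(w)[None, :], labels].sum()
    e += beta * (labels[:, 1:] != labels[:, :-1]).sum()  # horizontal edges
    e += beta * (labels[1:, :] != labels[:-1, :]).sum()  # vertical edges
    return float(e)

def icm(unary, beta=1.0, iters=5):
    """Iterated conditional modes: greedy local label updates.

    Starts from the per-pixel unary minimiser and sweeps the grid,
    re-labelling each pixel to minimise its local (unary + Potts) cost.
    """
    h, w, k = unary.shape
    labels = unary.argmin(axis=2)
    for _ in range(iters):
        for i in range(h):
            for j in range(w):
                costs = unary[i, j].copy()
                for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                    ni, nj = i + di, j + dj
                    if 0 <= ni < h and 0 <= nj < w:
                        costs += beta * (np.arange(k) != labels[ni, nj])
                labels[i, j] = costs.argmin()
    return labels
```

Because each ICM update minimises the conditional cost of a single pixel given its neighbours, the global energy never increases, which is why such greedy schemes are a common baseline for CRF inference; the paper's hierarchical model adds inter-source consistency terms beyond this flat sketch.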

Original language: English
Title of host publication: ISPRS Hannover Workshop 2013 (Volume XL-1/W1)
Subtitle of host publication: WG I/4, III/4, IC IV/VIII, VII/2
Editors: C. Heipke, K. Jacobsen, F. Rottensteiner, U. Sörgel
Publisher: International Society for Photogrammetry and Remote Sensing (ISPRS)
Number of pages: 4
Publication status: Published - 1 Jan 2013
Event: ISPRS Hannover Workshop 2013 - Hannover, Germany
Duration: 21 May 2013 – 24 May 2013

Publication series

Name: International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences - ISPRS Archives
ISSN (Print): 1682-1750


Conference: ISPRS Hannover Workshop 2013


Keywords:

  • Conditional random field
  • Feature fusion
  • Hierarchical model
  • Image classification
  • Multi-source data

