Urban land use extraction from very high resolution remote sensing images by Bayesian network

Mengmeng Li, A. Stein, W. Bijker

Research output: Chapter in Book/Report/Conference proceeding › Chapter › Academic › peer-review

3 Citations (Scopus)


This study aims to characterize the spatial arrangement of land cover features and to integrate this spatial arrangement information with other commonly used land use indicators. The characterization is conducted at the object level, corresponding to land cover objects. At the local urban level, a VHR image is dominated by buildings; the characterization of the spatial arrangement of land cover elements is therefore conducted mainly on building objects. Since building objects serve different functions in urban areas, we classify them into a set of types according to their geometrical, morphological, and contextual properties. The spatial arrangement is then characterized by the composition of these building types. To integrate the land use indicators with the spatial arrangement information, we construct a Bayesian network in which the spatial arrangement serves as a latent variable and the land use indicators, calculated following existing studies, are treated as nodes of the network. This is followed by urban land use classification. We applied the proposed method to a subset of a Pleiades image covering an urban area of Wuhan, China. We conclude that the proposed method provides an effective means for urban land use extraction.
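The inference step described above, classifying a land use parcel from observed indicators while marginalizing out the latent spatial-arrangement variable, can be sketched as follows. This is a minimal illustration only: the class labels, arrangement states, indicator names, and all probability values are hypothetical assumptions, not the network structure or parameters from the paper.

```python
# Sketch of latent-variable Bayesian network inference for urban land use
# classification. Structure: land use class U -> latent arrangement S ->
# observed binary indicators. All names and numbers are illustrative.

# Land use classes (hypothetical labels)
CLASSES = ["residential", "commercial"]
# Latent spatial-arrangement states of building objects (hypothetical)
ARRANGEMENTS = ["regular", "irregular"]

# Prior P(U) over land use classes (assumed uniform)
p_class = {"residential": 0.5, "commercial": 0.5}

# P(S | U): latent arrangement given land use class (assumed values)
p_arr_given_class = {
    ("regular", "residential"): 0.8, ("irregular", "residential"): 0.2,
    ("regular", "commercial"): 0.3, ("irregular", "commercial"): 0.7,
}

# P(indicator | S): two binary land use indicators conditioned on the
# latent arrangement (assumed values, hypothetical indicator names)
p_ind_given_arr = {
    "high_density": {"regular": 0.9, "irregular": 0.4},
    "large_buildings": {"regular": 0.2, "irregular": 0.7},
}

def posterior(observed):
    """P(U | indicators), marginalizing out the latent arrangement S."""
    scores = {}
    for u in CLASSES:
        total = 0.0
        for s in ARRANGEMENTS:  # sum over latent states
            lik = p_arr_given_class[(s, u)]
            for ind, value in observed.items():
                p = p_ind_given_arr[ind][s]
                lik *= p if value else (1.0 - p)
            total += lik
        scores[u] = p_class[u] * total
    z = sum(scores.values())  # normalize to a proper posterior
    return {u: v / z for u, v in scores.items()}

post = posterior({"high_density": True, "large_buildings": False})
print(max(post, key=post.get))  # prints "residential"
```

Classification then amounts to taking the maximum-posterior class; in the paper the indicator nodes and the latent arrangement states are of course derived from the VHR image objects rather than set by hand.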
Original language: English
Title of host publication: Proceedings of International Geoscience and Remote Sensing Symposium (IGARSS): Advancing the understanding of our living planet, 10-15 July 2016, Beijing, China
Place of publication: Piscataway
ISBN (Print): 978-1-5090-3332-4
Publication status: Published - Jul 2016

Publication series

  • METIS-321637

