This study investigates the incorporation of open source data into a Bayesian classification of urban land use from very high resolution (VHR) stereo satellite images. The adopted classification framework starts from urban land cover classification, proceeds to building-type characterization, and results in urban land use. For urban land cover classification, a preliminary classification distinguishes tree, grass, and shadow objects using a random forest at a fine segmentation level. Fuzzy decision trees derived from hierarchical Bayesian models separate buildings from other man-made objects at a coarse segmentation level, where OpenStreetMap provides prior building information. A Bayesian network classifier combining commonly used land use indicators and spatial arrangement is used for the urban land use classification. The experiments were conducted on GeoEye stereo images over Oklahoma City, USA. Experimental results showed that urban land use classification using VHR stereo images performed better than that using a monoscopic VHR image, and that the integration of open source data improved the final urban land use classification. Our results also show a way of transferring the adopted urban land use classification framework, developed for a specific urban area in China, to other urban areas. The study concludes that incorporating open source data by Bayesian analysis improves urban land use classification. Moreover, a pretrained convolutional neural network fine-tuned on the UC Merced land use dataset offers a useful tool to extract additional information for urban land use classification.
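The core idea of fusing OpenStreetMap priors with image-derived evidence can be sketched with Bayes' rule: per-object class likelihoods from the image classifier are weighted by prior building probabilities derived from OSM footprints, then renormalized. This is a minimal illustrative sketch, not the paper's implementation; the prior values and the two-class encoding (building vs. other man-made object) are assumptions for the example.

```python
import numpy as np

def bayes_fuse(likelihoods, priors):
    """Combine per-class likelihoods with per-class priors via Bayes' rule.

    likelihoods: (n_objects, n_classes) class-conditional scores
                 from an image-based classifier.
    priors:      (n_objects, n_classes) prior probabilities, e.g.
                 raised for "building" where an OSM footprint overlaps
                 the image object (hypothetical encoding).
    Returns posterior probabilities, normalized per object.
    """
    posterior = likelihoods * priors
    return posterior / posterior.sum(axis=1, keepdims=True)

# Two image objects; columns = [building, other man-made object].
lik = np.array([[0.6, 0.4],   # classifier slightly favors "building"
                [0.5, 0.5]])  # classifier is undecided
# An OSM footprint overlaps object 0, so its building prior is raised;
# object 1 has no OSM evidence, so its prior stays uniform.
pri = np.array([[0.9, 0.1],
                [0.5, 0.5]])
posterior = bayes_fuse(lik, pri)
```

With these numbers the OSM prior sharpens the ambiguous first object toward "building", while the second object, lacking map evidence, keeps the classifier's undecided posterior.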
|Journal|IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing|
|Publication status|Published - 1 Nov 2017|
Li, M., De Beurs, K. M., Stein, A., & Bijker, W. (2017). Incorporating Open Source Data for Bayesian Classification of Urban Land Use From VHR Stereo Images. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 10(11), 4930-4943. https://doi.org/10.1109/JSTARS.2017.2737702