Improving Representation of Land‐use Maps Derived from Object‐oriented Image Classification

Weixiu Gao, Alfred Stein, Li Yang, Ying Wang, Hui Fang

Research output: Contribution to journal › Article › Academic › peer-review

3 Citations (Scopus)

Abstract

Object‐oriented (OO) image analysis provides an efficient way to generate vector‐format land‐cover and land‐use maps from remotely sensed images. Such image‐derived vector maps, however, are generally presented with congested and twisted polygons with step‐like boundaries. They include unclassified polygons and polygons with geometric conflicts such as unreadably small areas and narrow corridors. These complex and poorly readable representations mean that such maps often do not comply well with the Gestalt principles of cartography. This article describes a framework designed to improve map representation by resolving these problematic polygons. It presents a polygon similarity model that integrates semantic, geometric and spectral characteristics of the image‐derived polygons to eliminate small and unclassified polygons. In addition, an outward‐inward‐buffering approach is presented to resolve the narrow‐corridor conflicts of a polygon and improve its overall appearance. A case study demonstrates that implementing the framework reduces the number of polygons by 32% and the length of the polygon boundaries by 20%. At the same time, compared with the original image‐derived land‐use maps, it does not cause distinct changes in the distribution of land‐use types (less than 0.05%) or in the overall accuracy (a decrease of only 0.02%). We conclude that the presented framework and models effectively improve the overall representation of image‐derived maps without distinct changes in their semantic characteristics and accuracy.
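The core idea of the similarity model, combining semantic, geometric and spectral characteristics to pick a merge target for a small or unclassified polygon, can be sketched as follows. This is a minimal illustration, not the paper's actual model: the individual similarity functions, the class-affinity table and the weights are all illustrative assumptions.

```python
# Hedged sketch of a polygon similarity model for merging a small or
# unclassified polygon into its most similar neighbour. All functions,
# weights and data values below are illustrative assumptions.

def semantic_similarity(class_a, class_b, affinity):
    """Similarity of land-use classes, looked up in a user-supplied affinity table."""
    return affinity.get((class_a, class_b), affinity.get((class_b, class_a), 0.0))

def geometric_similarity(shared_boundary, perimeter):
    """Fraction of the small polygon's perimeter shared with the neighbour."""
    return shared_boundary / perimeter if perimeter > 0 else 0.0

def spectral_similarity(mean_a, mean_b):
    """Similarity of mean band values: 1 for identical spectra, falling with distance."""
    dist = sum((x - y) ** 2 for x, y in zip(mean_a, mean_b)) ** 0.5
    return 1.0 / (1.0 + dist)

def best_merge_target(small, neighbours, affinity, weights=(0.4, 0.3, 0.3)):
    """Return the neighbour with the highest weighted combined similarity score."""
    ws, wg, wp = weights
    def score(nb):
        return (ws * semantic_similarity(small["cls"], nb["cls"], affinity)
                + wg * geometric_similarity(nb["shared"], small["perimeter"])
                + wp * spectral_similarity(small["mean"], nb["mean"]))
    return max(neighbours, key=score)

# Toy example: a small grass polygon bordered by forest and water.
small = {"cls": "grass", "perimeter": 40.0, "mean": (80.0, 120.0, 60.0)}
neighbours = [
    {"cls": "forest", "shared": 10.0, "mean": (60.0, 110.0, 50.0)},
    {"cls": "water",  "shared": 25.0, "mean": (20.0, 40.0, 90.0)},
]
affinity = {("grass", "forest"): 0.8, ("grass", "water"): 0.1}
print(best_merge_target(small, neighbours, affinity)["cls"])  # → forest
```

In this toy case the high semantic affinity and close spectra of grass and forest outweigh the longer shared boundary with water, so the small polygon would be merged into the forest neighbour.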
Original language: English
Pages (from-to): 387-405
Journal: Transactions in GIS
Volume: 17
Issue number: 3
DOIs
Publication status: Published - 2013
