Deep Fully Convolutional Networks for the Detection of Informal Settlements in VHR Images

Research output: Contribution to journal › Article › Academic › peer-review

141 Citations (Scopus)
109 Downloads (Pure)

Abstract

This letter investigates fully convolutional networks (FCNs) for the detection of informal settlements in very high resolution (VHR) satellite images. Informal settlements, or slums, are proliferating in developing countries, and their detection and classification provide vital information for decision making and for planning urban upgrading processes. Distinguishing different urban structures in VHR images is challenging because of the abstract semantic definition of the classes, as opposed to the separation of standard land-cover classes; the task requires the extraction of texture and spatial features. To this end, we introduce deep FCNs that perform pixel-wise image labeling by automatically learning a higher-level representation of the data. Deep FCNs can learn a hierarchy of features associated with increasing levels of abstraction, from raw pixel values to edges and corners up to complex spatial patterns. We present a deep FCN using dilated convolutions of increasing spatial support, capable of learning informative features that capture long-range pixel dependencies while keeping a limited number of network parameters. Experiments carried out on a QuickBird image acquired over the city of Dar es Salaam, Tanzania, show that the proposed FCN outperforms state-of-the-art convolutional networks. Moreover, the computational cost of the proposed technique is significantly lower than that of standard patch-based architectures.
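The abstract's central mechanism, stacking dilated convolutions of increasing spatial support to enlarge the receptive field without adding parameters, can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the helper names `dilated_conv2d` and `receptive_field`, and the example dilation schedule 1, 2, 4, are illustrative assumptions.

```python
import numpy as np

def dilated_conv2d(x, w, dilation=1):
    """'Valid' 2D cross-correlation of a single-channel image x with
    kernel w, whose taps are spread `dilation` pixels apart.
    (Illustrative sketch, not the letter's actual network code.)"""
    kh, kw = w.shape
    # Effective (dilated) kernel extent along each axis.
    eff_h = (kh - 1) * dilation + 1
    eff_w = (kw - 1) * dilation + 1
    H, W = x.shape
    out = np.zeros((H - eff_h + 1, W - eff_w + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            # Strided slicing picks out the dilated kernel taps.
            patch = x[i:i + eff_h:dilation, j:j + eff_w:dilation]
            out[i, j] = np.sum(patch * w)
    return out

def receptive_field(kernel_sizes, dilations):
    """Receptive field of a stack of stride-1 dilated conv layers:
    each layer adds (k - 1) * d pixels of context."""
    rf = 1
    for k, d in zip(kernel_sizes, dilations):
        rf += (k - 1) * d
    return rf

# Three 3x3 layers with dilations 1, 2, 4 see a 15x15 window
# using only 3 * 9 = 27 weights, versus 225 for one dense 15x15 kernel.
print(receptive_field([3, 3, 3], [1, 2, 4]))  # → 15
```

This parameter economy is exactly the trade-off the abstract claims: long-range pixel dependencies are captured by growing the dilation factor rather than the kernel size.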
Original language: English
Pages (from-to): 2325-2329
Journal: IEEE Geoscience and Remote Sensing Letters
Volume: 14
Issue number: 12
DOIs
Publication status: Published - 1 Jan 2017
