Fully Convolutional Networks for Multi-temporal SAR Image Classification

A.G. Mullissa, C. Persello, V.A. Tolpekin

Research output: Chapter in Book/Report/Conference proceeding › Chapter › Academic › peer-review

29 Citations (Scopus)
23 Downloads (Pure)

Abstract

Classifying crop types from multi-temporal SAR data is a complex task because spatial and temporal features must be extracted from images affected by speckle. Previous methods applied speckle filtering and classification in two separate processing steps. This paper introduces fully convolutional networks (FCN) for pixel-wise classification of crops from multi-temporal SAR data, performing speckle filtering and classification in a single framework. It also uses dilated kernels to improve the network's ability to learn long-distance spatial dependencies. The proposed FCN was compared with a patch-based convolutional neural network (CNN) and a support vector machine (SVM) classifier, and outperformed both.
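The dilated kernels mentioned in the abstract enlarge a filter's receptive field without adding weights. The following is a minimal NumPy sketch of dilated (atrous) 2D convolution for illustration only; it is not the authors' network, and the image size, averaging kernel, and function name are arbitrary assumptions.

```python
import numpy as np

def dilated_conv2d(image, kernel, dilation=1):
    """Valid-mode 2D correlation with a dilated (atrous) kernel.

    A k x k kernel with dilation d covers an effective receptive
    field of (k - 1) * d + 1 pixels per side, with no extra weights.
    (Illustrative sketch, not the paper's implementation.)
    """
    k = kernel.shape[0]
    eff = (k - 1) * dilation + 1          # effective kernel extent
    h, w = image.shape
    out = np.zeros((h - eff + 1, w - eff + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            # Sample the input on a dilated grid, then weight and sum.
            patch = image[i:i + eff:dilation, j:j + eff:dilation]
            out[i, j] = np.sum(patch * kernel)
    return out

# A 3x3 averaging kernel applied to a synthetic single-band image.
img = np.arange(64, dtype=float).reshape(8, 8)
kernel = np.ones((3, 3)) / 9.0

y1 = dilated_conv2d(img, kernel, dilation=1)  # 3x3 receptive field -> (6, 6)
y2 = dilated_conv2d(img, kernel, dilation=2)  # 5x5 receptive field -> (4, 4)
```

With the same nine weights, dilation 2 widens the receptive field from 3x3 to 5x5 pixels, which is the mechanism the paper exploits to learn longer-distance spatial dependencies.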
Original language: English
Title of host publication: IGARSS 2018 - 2018 IEEE International Geoscience and Remote Sensing Symposium
Publisher: IEEE
Pages: 6635-6638
Number of pages: 4
ISBN (Electronic): 978-1-5386-7150-4
Publication status: Published - 5 Nov 2018
Event: 38th IEEE International Geoscience and Remote Sensing Symposium, IGARSS 2018: Observing, Understanding and Forecasting the Dynamics of Our Planet - Feria Valencia Convention & Exhibition Center, Valencia, Spain
Duration: 22 Jul 2018 - 27 Jul 2018
Conference number: 38
https://www.igarss2018.org/

Conference

Conference: 38th IEEE International Geoscience and Remote Sensing Symposium, IGARSS 2018
Abbreviated title: 2018
Country/Territory: Spain
City: Valencia
Period: 22/07/18 - 27/07/18
Internet address: https://www.igarss2018.org/

Keywords

  • Deep learning
  • Fully convolutional networks
  • Remote Sensing
  • SAR
  • Sentinel-1
