Feature enhancement network for cloud removal in optical images by fusing with SAR images

Chenxi Duan, Mariana Belgiu*, Alfred Stein

*Corresponding author for this work

Research output: Contribution to journal › Article › Academic › peer-review


Abstract

The presence of cloud-covered pixels is inevitable in optical remote-sensing images. Reconstructing cloud-covered details is therefore important to improve the usability of these images for subsequent analysis tasks. To tackle the high computational resource requirements that hinder application at scale, this paper proposes a Feature Enhancement Network (FENet) for removing clouds in satellite images by fusing Synthetic Aperture Radar (SAR) and optical images. The proposed network consists of a newly designed Feature Aggregation Residual Block (FAResblock) and a Feature Enhancement Block (FEBlock). FENet is evaluated on the publicly available SEN12MS-CR dataset, where it achieves promising results compared to benchmark and state-of-the-art methods in terms of both visual quality and quantitative evaluation metrics. The results show that the proposed feature enhancement network is an effective solution for satellite image cloud removal at lower computational and time cost. Owing to its effectiveness and efficiency, the proposed network has potential for practical applications in remote sensing. The code and trained model will be available at https://github.com/chenxiduan/FENet.
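The abstract does not detail the internals of FAResblock or FEBlock. As context only, the sketch below illustrates the kind of channel-wise SAR-optical fusion that SEN12MS-CR-based cloud-removal networks typically take as input: a co-registered Sentinel-2 optical patch (13 bands) stacked with a Sentinel-1 SAR patch (2 polarizations). The function name, per-channel normalization, and early-fusion choice are illustrative assumptions, not the paper's actual pipeline.

```python
import numpy as np

def fuse_sar_optical(optical, sar):
    """Channel-wise early fusion of co-registered optical and SAR patches.

    optical: (H, W, C_opt) reflectance array; sar: (H, W, C_sar) backscatter array.
    Each channel is min-max normalized before stacking so that neither
    modality's value range dominates the fused network input.
    NOTE: illustrative sketch, not the fusion scheme used by FENet.
    """
    def minmax(x):
        lo = x.min(axis=(0, 1), keepdims=True)
        hi = x.max(axis=(0, 1), keepdims=True)
        return (x - lo) / (hi - lo + 1e-8)  # epsilon guards constant channels

    return np.concatenate([minmax(optical), minmax(sar)], axis=-1)

# Example: a 13-band Sentinel-2 patch fused with a 2-band Sentinel-1 patch,
# matching the SEN12MS-CR band counts (13 + 2 = 15 input channels).
opt = np.random.rand(256, 256, 13).astype(np.float32)
sar = np.random.rand(256, 256, 2).astype(np.float32)
fused = fuse_sar_optical(opt, sar)
print(fused.shape)  # (256, 256, 15)
```

A deep network such as FENet would consume this stacked tensor (or fuse the modalities at the feature level instead); the stacking shown here is the simplest common baseline.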

Original language: English
Pages (from-to): 51-67
Number of pages: 17
Journal: International Journal of Remote Sensing
Volume: 45
Issue number: 1
Early online date: 27 Dec 2023
Publication status: Published - 2024

Keywords

  • cloud removal
  • data fusion
  • deep learning
  • remote sensing
  • SAR-optical
  • ITC-HYBRID
  • UT-Hybrid-D
  • ITC-ISI-JOURNAL-ARTICLE
