Fusion of multispectral data through illumination-aware deep neural networks for pedestrian detection

Dayan Guan, Yanpeng Cao* (Corresponding Author), Jiangxin Yang, Yanlong Cao, Michael Ying Yang


Research output: Contribution to journal › Article › Academic › peer-review

11 Citations (Scopus)

Abstract

Multispectral pedestrian detection has received extensive attention in recent years as a promising solution for robust human target detection in around-the-clock applications (e.g., security surveillance and autonomous driving). In this paper, we demonstrate that the illumination information encoded in multispectral images can be utilized to significantly boost pedestrian detection performance. A novel illumination-aware weighting mechanism is presented to accurately depict the illumination condition of a scene. This illumination information is incorporated into two-stream deep convolutional neural networks to learn multispectral human-related features under different illumination conditions (daytime and nighttime). Moreover, we utilize the illumination information together with multispectral data to generate more accurate semantic segmentation, which is used to supervise the training of the pedestrian detector. Putting all of the pieces together, we present an effective framework for multispectral pedestrian detection based on multi-task learning of illumination-aware pedestrian detection and semantic segmentation. Our proposed method is trained end-to-end with a well-designed multi-task loss function and outperforms state-of-the-art approaches on the KAIST multispectral pedestrian dataset.
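The core idea of the illumination-aware weighting described above can be sketched as a scalar day/night confidence that gates the contributions of the visible and thermal feature streams before fusion. The snippet below is a minimal illustrative sketch, not the paper's actual network: the sigmoid gating, the function names, and the toy feature vectors are all assumptions made for illustration.

```python
import math

def illumination_weights(day_score):
    """Map a scalar day/night confidence (a logit; hypothetical form)
    to a pair of stream weights that sum to one: the visible (RGB)
    stream weight rises with daytime confidence, the thermal stream
    receives the remainder."""
    w_rgb = 1.0 / (1.0 + math.exp(-day_score))  # sigmoid gate
    return w_rgb, 1.0 - w_rgb

def fuse_features(f_rgb, f_thermal, day_score):
    """Element-wise weighted fusion of the two feature streams."""
    w_rgb, w_th = illumination_weights(day_score)
    return [w_rgb * a + w_th * b for a, b in zip(f_rgb, f_thermal)]

# Toy example: in daytime the visible features dominate the fused
# representation; at night the thermal features dominate.
day_fused = fuse_features([1.0, 2.0], [10.0, 20.0], day_score=4.0)
night_fused = fuse_features([1.0, 2.0], [10.0, 20.0], day_score=-4.0)
```

In the multi-task setting the paper describes, such fused features would feed both the detection head and the segmentation supervision, with a combined loss of the general form L = L_det + λ·L_seg.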
Original language: English
Pages (from-to): 148-157
Number of pages: 10
Journal: Information Fusion
Volume: 50
Early online date: 29 Nov 2018
DOIs
Publication status: Published - 1 Oct 2019

Keywords

  • ITC-ISI-JOURNAL-ARTICLE
