CABiNet: Efficient Context Aggregation Network for Low-Latency Semantic Segmentation

Saumya Kumaar, Ye Lyu, F. Nex, Michael Ying Yang

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review

Abstract

With the increasing demand for autonomous machines, pixel-wise semantic segmentation for visual scene understanding needs to be not only accurate but also efficient enough for real-time applications. In this paper, we propose CABiNet (Context Aggregated Bi-lateral Network), a dual-branch convolutional neural network (CNN) with significantly lower computational cost than the state-of-the-art, while maintaining competitive prediction accuracy. Building upon existing multi-branch architectures for high-speed semantic segmentation, we design a cheap high-resolution branch for effective spatial detailing and a context branch with lightweight versions of global aggregation and local distribution blocks, capable of capturing both the long-range and the local contextual dependencies required for accurate semantic segmentation, at low computational overhead. Specifically, we achieve 76.6% and 75.9% mIOU on the Cityscapes validation and test sets respectively, at 76 FPS on an NVIDIA RTX 2080Ti and 8 FPS on a Jetson Xavier NX.
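The dual-branch idea from the abstract (a cheap high-resolution spatial branch fused with a low-resolution context branch that aggregates global information) can be sketched as follows. This is a toy NumPy illustration under assumed feature shapes; the function names, shapes, and fusion scheme are assumptions for illustration, not the authors' actual blocks:

```python
import numpy as np

def global_aggregation(x):
    """Illustrative global-aggregation step: pool a (C, H, W) feature
    map to a per-channel global descriptor and redistribute it
    spatially via broadcasting. A simplified stand-in for the paper's
    global aggregation block, not its exact formulation."""
    g = x.mean(axis=(1, 2), keepdims=True)  # (C, 1, 1) global context
    return x + g                            # broadcast context back

def fuse_branches(spatial, context):
    """Toy fusion of the high-resolution spatial branch with the
    upsampled context branch (nearest-neighbour upsampling)."""
    _, h, w = spatial.shape
    _, hc, wc = context.shape
    up = context.repeat(h // hc, axis=1).repeat(w // wc, axis=2)
    return spatial + up

rng = np.random.default_rng(0)
spatial = rng.standard_normal((16, 8, 8))  # cheap high-res branch output
context = rng.standard_normal((16, 2, 2))  # low-res context branch output
out = fuse_branches(spatial, global_aggregation(context))
print(out.shape)  # (16, 8, 8)
```

The point of the sketch is only the data flow: global context is computed at low resolution (cheap), then merged back into the detailed spatial stream.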
Original language: English
Title of host publication: 2021 IEEE International Conference on Robotics and Automation (ICRA)
Publisher: IEEE
Pages: 13517-13524
Number of pages: 8
ISBN (Electronic): 978-1-7281-9077-8
ISBN (Print): 978-1-7281-9078-5
DOIs
Publication status: Published - 18 Oct 2021
Event: IEEE International Conference on Robotics and Automation, ICRA 2021 - Xi'an, China, Virtual Event
Duration: 30 May 2021 – 5 Jun 2021

Conference

Conference: IEEE International Conference on Robotics and Automation, ICRA 2021
Abbreviated title: ICRA 2021
City: Virtual Event
Period: 30/05/21 – 5/06/21

