Visual response inhibition for increased robustness of convolutional networks to distribution shifts

Nicola Strisciuglio, George Azzopardi

Research output: Contribution to conference › Paper › peer-review


Abstract

Convolutional neural networks have been shown to suffer from distribution shifts in the test data, for instance those caused by the so-called common corruptions and perturbations. Test images can contain noise, digital transformations, and blur that were not present in the training data, negatively impacting the performance of trained models. Humans are far more robust to noise and visual distortions than deep networks. In this work, we explore the effectiveness of a neuronal response inhibition mechanism, called push-pull, observed in the early part of the visual system, to increase the robustness of deep convolutional networks. We deploy a Push-Pull inhibition layer as a replacement for the initial convolutional layers (the input layer and the first block of residual and dense architectures) of standard convolutional networks for image classification. We show that the Push-Pull inhibition component increases the robustness of standard image classification networks to distribution shifts on the CIFAR10-C and CIFAR10-P test sets.
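The push-pull mechanism summarized in the abstract can be illustrated with a minimal single-channel NumPy sketch: an excitatory (push) response from a kernel is inhibited by a weighted (pull) response computed with the sign-inverted kernel. The function names `conv2d_valid` and `push_pull`, and the inhibition weight `alpha`, are our own illustrative choices; the authors' layer operates inside a trained network and also uses a wider (upsampled) pull kernel, which this sketch omits.

```python
import numpy as np

def relu(x):
    """Half-wave rectification, as in the push and pull responses."""
    return np.maximum(x, 0.0)

def conv2d_valid(image, kernel):
    """Naive 2D cross-correlation with 'valid' padding (illustrative only)."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.empty((ih - kh + 1, iw - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def push_pull(image, kernel, alpha=1.0):
    """Push-pull response: rectified response to the kernel (push),
    minus alpha times the rectified response to the inverted kernel (pull)."""
    push = relu(conv2d_valid(image, kernel))
    pull = relu(conv2d_valid(image, -kernel))
    return push - alpha * pull
```

The pull term suppresses responses to stimuli of opposite polarity (e.g. noise flipping the local contrast), which is the intuition behind the layer's robustness to corruptions.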
Original language: English
Number of pages: 9
Publication status: Published - 5 Oct 2022
Event: 36th Annual Conference on Neural Information Processing Systems, NeurIPS 2022: Connecting Methods and Applications - New Orleans Convention Center, New Orleans, United States
Duration: 28 Nov 2022 - 9 Dec 2022
Conference number: 36
https://neurips.cc/Conferences/2022

Conference

Conference: 36th Annual Conference on Neural Information Processing Systems, NeurIPS 2022
Abbreviated title: NeurIPS 2022
Country/Territory: United States
City: New Orleans
Period: 28/11/22 - 9/12/22
Internet address: https://neurips.cc/Conferences/2022
