On improving deep learning generalization with adaptive sparse connectivity

Shiwei Liu, Decebal Constantin Mocanu, Mykola Pechenizkiy

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review

Abstract

Large neural networks are very successful in various tasks. However, with limited data, the generalization capability of deep neural networks is also very limited. In this paper, we provide initial empirical evidence that intrinsically sparse neural networks with adaptive sparse connectivity, which by design have a strict parameter budget during the training phase, have better generalization capabilities than their fully-connected counterparts. Besides this, we propose a new technique to train these sparse models by combining the Sparse Evolutionary Training (SET) procedure with neuron pruning. Applied to Multilayer Perceptrons (MLPs) and evaluated on 15 datasets, our proposed technique zeros out around 50% of the hidden neurons during training, while keeping the number of parameters to optimize linear in the number of neurons. The results show competitive classification and generalization performance.
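The abstract only names the ingredients (SET-style weight rewiring plus neuron pruning), so the following is a minimal sketch of how such a step could look, not the authors' implementation: the rewiring fraction `zeta`, the magnitude-based drop criterion, and the fan-in rule for removing hidden neurons are illustrative assumptions.

```python
# Hypothetical sketch of one SET rewiring step with neuron pruning,
# assembled from the abstract's description; hyperparameters and the
# exact pruning criteria are assumptions, not the paper's settings.
import torch

def evolve_sparse_layer(weight, mask, zeta=0.3):
    """One SET-style rewiring step: drop the weakest links, regrow at random."""
    active = mask.bool()
    n_drop = int(zeta * active.sum())
    if n_drop == 0:
        return mask
    # 1) Remove the zeta fraction of existing connections closest to zero.
    magnitudes = weight.abs().masked_fill(~active, float("inf"))
    drop_idx = torch.topk(magnitudes.flatten(), n_drop, largest=False).indices
    mask = mask.clone()
    mask.view(-1)[drop_idx] = 0.0
    # 2) Regrow the same number of connections at random inactive positions,
    #    keeping the total parameter budget fixed during training.
    inactive = (mask == 0).flatten().nonzero().squeeze(1)
    grow_idx = inactive[torch.randperm(len(inactive))[:n_drop]]
    mask.view(-1)[grow_idx] = 1.0
    return mask

def prune_dead_neurons(mask, min_fan_in=1):
    """Zero out hidden units whose remaining fan-in falls below a threshold
    (assumed criterion; the paper's neuron-pruning rule may differ)."""
    fan_in = mask.sum(dim=1)             # incoming connections per hidden unit
    dead = fan_in < min_fan_in
    mask[dead] = 0.0
    return mask, dead

# Usage: a single sparse hidden layer of an MLP, rewired once per epoch.
w = torch.randn(256, 784) * 0.01         # hidden x input weight matrix
m = (torch.rand_like(w) < 0.05).float()  # ~5% random initial connectivity
m = evolve_sparse_layer(w, m)
m, dead = prune_dead_neurons(m)
print(f"active weights: {int(m.sum())}, pruned neurons: {int(dead.sum())}")
```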
Original language: English
Title of host publication: ICML 2019 Workshop on Understanding and Improving Generalization in Deep Learning
Publication status: Published - 14 Jun 2019
Externally published: Yes
Event: 36th International Conference on Machine Learning, ICML 2019 - Long Beach, United States
Duration: 10 Jun 2019 - 15 Jun 2019
Conference number: 36

Conference

Conference: 36th International Conference on Machine Learning, ICML 2019
Abbreviated title: ICML 2019
Country: United States
City: Long Beach
Period: 10/06/19 - 15/06/19

