A brain-inspired algorithm for training highly sparse neural networks

Zahra Atashgahi*, Joost Pieterse, Shiwei Liu, Decebal Constantin Mocanu, Raymond N.J. Veldhuis, Mykola Pechenizkiy

*Corresponding author for this work

Research output: Contribution to journal › Article › Academic › peer-review


Abstract

Sparse neural networks attract increasing interest as they exhibit comparable performance to their dense counterparts while being computationally efficient. Pruning dense neural networks is among the most widely used methods for obtaining a sparse neural network. Driven by the high training cost of such methods, which can be unaffordable on low-resource devices, training sparse neural networks sparsely from scratch has recently gained attention. However, existing sparse training algorithms suffer from various issues, including poor performance in high-sparsity scenarios, the need to compute dense gradient information during training, or purely random topology search. In this paper, inspired by the evolution of the biological brain and Hebbian learning theory, we present a new sparse training approach that evolves sparse neural networks according to the behavior of the neurons in the network. Concretely, by exploiting the cosine similarity metric to measure the importance of connections, our proposed method, "Cosine similarity-based and random topology exploration (CTRE)", evolves the topology of sparse neural networks by adding the most important connections to the network, without computing dense gradients in the backward pass. We carried out experiments on eight datasets, including tabular, image, and text datasets, and demonstrate that our proposed method outperforms several state-of-the-art sparse training algorithms on extremely sparse neural networks by a large margin. The implementation code is available on GitHub.
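For the exact CTRE algorithm, the paper itself should be consulted; the Python sketch below only illustrates the core idea stated in the abstract, namely scoring candidate connections by the cosine similarity of the activation patterns of their endpoint neurons and growing the most important absent connections, instead of relying on dense gradients. All names here (cosine_importance, grow_connections, the parameter k) are hypothetical, and the activation shapes are assumptions for the sake of a runnable example.

```python
import numpy as np

def cosine_importance(acts_in, acts_out):
    """Score every candidate connection between two layers by the cosine
    similarity of its endpoint neurons' activation vectors over a batch.

    acts_in:  (batch, n_in)  activations of the source layer
    acts_out: (batch, n_out) activations of the target layer
    Returns an (n_in, n_out) matrix of cosine similarities.
    """
    # Normalize each neuron's activation vector across the batch dimension.
    a = acts_in / (np.linalg.norm(acts_in, axis=0, keepdims=True) + 1e-12)
    b = acts_out / (np.linalg.norm(acts_out, axis=0, keepdims=True) + 1e-12)
    return a.T @ b

def grow_connections(mask, acts_in, acts_out, k):
    """Add the k most important currently-absent connections to a binary
    connectivity mask (1 = connection present, 0 = absent)."""
    scores = np.abs(cosine_importance(acts_in, acts_out))
    scores[mask == 1] = -np.inf                 # skip existing connections
    top = np.argsort(scores, axis=None)[-k:]    # flat indices of top-k scores
    new_mask = mask.copy()
    new_mask[np.unravel_index(top, mask.shape)] = 1
    return new_mask

# Toy usage: grow 25 connections in a ~5%-dense layer of 100 x 50 neurons.
rng = np.random.default_rng(0)
acts_in = rng.standard_normal((256, 100))
acts_out = rng.standard_normal((256, 50))
mask = (rng.random((100, 50)) < 0.05).astype(int)
mask = grow_connections(mask, acts_in, acts_out, k=25)
```

In a sparse training loop this growth step would typically alternate with pruning the least important existing connections, keeping the total connection count (and hence the sparsity level) fixed; that scheduling detail is outside the scope of this sketch.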
Original language: English
Pages (from-to): 4411-4452
Number of pages: 42
Journal: Machine Learning
Volume: 111
Issue number: 12
Early online date: 8 Nov 2022
Publication status: Published - Dec 2022

Keywords

  • Deep learning
  • Sparse neural networks
  • Sparse training
