Enhancing Learning in Sparse Neural Networks: A Hebbian Learning Approach

Alexander de Ranitz, Ardion D. Beldad, Elena Mocanu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review


Abstract

Artificial neural networks have proven to be capable of mastering many complex tasks. However, training such networks can be extremely resource intensive. In this research, the learning rule of a neural network trained using sparse evolutionary training (SET) is extended based on Hebbian theory. A mathematical formulation of Hebbian theory, encompassing inhibitory neurons and tailored for artificial neural networks, is proposed. The resulting novel algorithm, referred to as HebbSET, exhibits enhanced performance in terms of learning speed and final accuracy on two datasets. These findings underscore the potential of incorporating neuroscientific theories to enhance the capabilities of ANNs and bridge the gap between neuroscience and AI.
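The abstract describes extending a sparse network's learning rule with a Hebbian term that also covers inhibitory neurons. The paper itself gives the exact formulation; as a rough illustration only (not the authors' HebbSET algorithm), a classic Hebbian update Δw = η·pre·post can be restricted to the existing connections of a SET-style sparse layer, with hypothetically chosen inhibitory units contributing with negative sign:

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_out = 8, 4
density = 0.5  # fraction of connections present (illustrative value)

# SET-style sparse connectivity: a fixed binary mask over the weight matrix.
mask = (rng.random((n_in, n_out)) < density).astype(float)
weights = rng.normal(0.0, 0.1, (n_in, n_out)) * mask

# Mark some presynaptic units as inhibitory (hypothetical 20% fraction);
# their Hebbian contribution is given a negative sign.
sign = np.where(rng.random(n_in) < 0.2, -1.0, 1.0)

def hebbian_update(weights, pre, post, lr=0.01):
    """Hebbian rule dw = lr * pre * post, applied only to existing
    (unmasked) connections, with inhibitory units sign-flipped."""
    dw = lr * np.outer(sign * pre, post)
    return weights + dw * mask

pre = rng.random(n_in)
post = pre @ weights  # simple linear activation for the sketch
weights = hebbian_update(weights, pre, post)

# The sparse topology is preserved: pruned connections stay at zero.
assert np.all(weights[mask == 0] == 0)
```

This is only a sketch under the stated assumptions; HebbSET's actual formulation, including how it interacts with SET's evolutionary rewiring, is defined in the paper.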
Original language: English
Title of host publication: BNAIC/BENELEARN 2023
Number of pages: 2
Publication status: Published - 2023
Event: Joint International Scientific Conferences on AI and Machine Learning, BNAIC/BeNeLearn 2023 - Delft University of Technology, Delft, Netherlands
Duration: 8 Nov 2023 - 10 Nov 2023
https://bnaic2023.tudelft.nl/

Conference

Conference: Joint International Scientific Conferences on AI and Machine Learning, BNAIC/BeNeLearn 2023
Abbreviated title: BNAIC/BeNeLearn 2023
Country/Territory: Netherlands
City: Delft
Period: 8/11/23 - 10/11/23

Keywords

  • Sparse neural networks
  • Hebbian learning
  • Artificial intelligence
  • Neuroscience

