Abstract
Artificial neural networks (ANNs) have proven capable of mastering many complex tasks. However, training such networks can be extremely resource-intensive. In this research, the learning rule of a neural network trained with sparse evolutionary training (SET) is extended based on Hebbian theory. A mathematical formulation of Hebbian theory, encompassing inhibitory neurons and tailored to artificial neural networks, is proposed. The resulting novel algorithm, referred to as HebbSET, exhibits enhanced performance in terms of learning speed and final accuracy on two datasets. These findings underscore the potential of incorporating neuroscientific theories to enhance the capabilities of ANNs and to bridge the gap between neuroscience and AI.
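The abstract does not spell out HebbSET itself, but the classical Hebbian rule it builds on ("neurons that fire together, wire together") can be sketched as a co-activation weight update restricted to the surviving connections of a sparse layer. The sketch below is illustrative only: the variable names, the learning rate, and the use of a binary sparsity mask are assumptions, not the paper's formulation.

```python
import numpy as np

def hebbian_update(w, mask, x, y, eta=0.01):
    """Illustrative Hebbian update: dw = eta * y x^T, applied only to
    connections that exist in the sparse topology (mask == 1).
    Signed activations loosely stand in for excitatory/inhibitory effects;
    this is a sketch, not the paper's HebbSET rule."""
    dw = eta * np.outer(y, x)   # co-activation of post- and pre-synaptic units
    return w + dw * mask        # update only surviving sparse weights

# Toy usage: a 3 -> 2 layer with roughly half the connections pruned away.
rng = np.random.default_rng(0)
mask = (rng.random((2, 3)) > 0.5).astype(float)   # hypothetical sparse topology
w = rng.standard_normal((2, 3)) * mask
x = np.array([1.0, 0.0, -1.0])   # pre-synaptic activations
y = np.array([0.5, -0.5])        # post-synaptic activations
w_new = hebbian_update(w, mask, x, y)
```

Pruned connections stay exactly zero after the update, mirroring how a SET-style sparse topology is preserved between weight updates.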
| Original language | English |
|---|---|
| Title of host publication | BNAIC/BENELEARN 2023 |
| Number of pages | 2 |
| Publication status | Published - 2023 |
| Event | Joint International Scientific Conferences on AI and Machine Learning, BNAIC/BeNeLearn 2023 - Delft University of Technology, Delft, Netherlands. Duration: 8 Nov 2023 → 10 Nov 2023. https://bnaic2023.tudelft.nl/ |
Conference
| Conference | Joint International Scientific Conferences on AI and Machine Learning, BNAIC/BeNeLearn 2023 |
|---|---|
| Abbreviated title | BNAIC/BeNeLearn 2023 |
| Country/Territory | Netherlands |
| City | Delft |
| Period | 8/11/23 → 10/11/23 |
| Internet address | https://bnaic2023.tudelft.nl/ |
Keywords
- Sparse neural networks
- Hebbian learning
- Artificial intelligence
- Neuroscience