Supervised Feature Selection with Neuron Evolution in Sparse Neural Networks

Zahra Atashgahi, Xuhao Zhang, Neil Kichler, Shiwei Liu, Lu Yin, Raymond Veldhuis, Mykola Pechenizkiy, Decebal Constantin Mocanu

Research output: Contribution to journal › Article › Academic › peer-review


Abstract

Feature selection, which selects an informative subset of variables from data, not only enhances model interpretability and performance but also alleviates resource demands. Recently, there has been growing attention to feature selection using neural networks. However, existing methods usually suffer from high computational costs when applied to high-dimensional datasets. In this paper, inspired by evolutionary processes, we propose a novel resource-efficient supervised feature selection method based on sparse neural networks, named "NeuroFS". By gradually pruning uninformative features from the input layer of a sparse neural network trained from scratch, NeuroFS derives an informative subset of features efficiently. Through several experiments on 11 low- and high-dimensional real-world benchmarks of different types, we demonstrate that NeuroFS achieves the highest ranking-based score among the considered state-of-the-art supervised feature selection models. The code will be available on GitHub.
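To make the idea of gradually pruning uninformative input features concrete, here is a minimal sketch in PyTorch. It is an illustration only, not the paper's NeuroFS algorithm: the magnitude-based importance score, the fixed pruning schedule, and all names (e.g. `sparse_feature_selection`, `drop_per_round`) are assumptions introduced for this example, and the neuron-evolution (prune-and-regrow) dynamics of the actual method are omitted.

```python
# Illustrative sketch: train a network from scratch while gradually pruning
# the least informative input features until only k remain. The importance
# score and schedule below are assumptions, not the exact NeuroFS procedure.
import torch
import torch.nn as nn

def sparse_feature_selection(X, y, k, hidden=64, epochs=60, drop_per_round=0.3):
    n, d = X.shape
    model = nn.Sequential(nn.Linear(d, hidden), nn.ReLU(), nn.Linear(hidden, 2))
    mask = torch.ones(d)                      # 1 = input feature still active
    opt = torch.optim.Adam(model.parameters(), lr=1e-2)
    loss_fn = nn.CrossEntropyLoss()

    for epoch in range(epochs):
        opt.zero_grad()
        out = model(X * mask)                 # masked features contribute nothing
        loss = loss_fn(out, y)
        loss.backward()
        opt.step()

        # Every few epochs, drop the weakest active features until k remain.
        active = int(mask.sum().item())
        if epoch % 5 == 4 and active > k:
            with torch.no_grad():
                # Importance: total outgoing weight magnitude of each input neuron.
                importance = model[0].weight.abs().sum(dim=0) * mask
                n_drop = min(max(1, int(drop_per_round * active)), active - k)
                # Push already-dropped features to the end before sorting.
                weakest = torch.argsort(importance + (1 - mask) * 1e9)[:n_drop]
                mask[weakest] = 0.0
                model[0].weight[:, weakest] = 0.0   # remove their connections

    return torch.nonzero(mask, as_tuple=False).flatten()

# Toy usage: 200 samples, 50 features, only the first 5 are informative.
torch.manual_seed(0)
X = torch.randn(200, 50)
y = (X[:, :5].sum(dim=1) > 0).long()
selected = sparse_feature_selection(X, y, k=5)
print("selected features:", selected.tolist())
```

The design choice illustrated here is that feature importance is read directly off the input layer's connections, so selection comes for free from training; the actual method additionally keeps the network sparse throughout and evolves its connectivity, which is what makes it efficient on high-dimensional data.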
Original language: English
Journal: Transactions on Machine Learning Research
Publication status: Published - 4 Mar 2023
