Quick and Robust Feature Selection: the Strength of Energy-efficient Sparse Training for Autoencoders (poster)

Zahra Atashgahi*, Ghada A.Z.N. Sokar, Tim van der Lee, Elena Mocanu, Decebal Constantin Mocanu, Raymond N.J. Veldhuis, Mykola Pechenizkiy

*Corresponding author for this work

Research output: Contribution to conference › Poster › Academic

Abstract

The recent increase in the amount of high-dimensional data brings major complications, including high computational costs and memory requirements. Feature selection, which identifies the most relevant and informative attributes of a dataset, has been introduced as a solution to this problem. However, most existing feature selection methods are computationally inefficient, and inefficient algorithms lead to high energy consumption, which is undesirable for devices with limited computational and energy resources. We present a novel feature selection method, named QuickSelection, which introduces the strength of a neuron in sparse neural networks as a criterion for measuring feature importance. This criterion, combined with sparsely connected denoising autoencoders trained with the sparse evolutionary training procedure, derives the importance of all input features simultaneously. The corresponding paper is available online on arXiv.
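As a rough illustration of the strength criterion described in the abstract (a minimal sketch, not the authors' released implementation), the Python snippet below scores each input feature by the summed absolute weights of its outgoing sparse connections and keeps the top k; the weight matrix, function names, and toy data are illustrative assumptions, and the sparse evolutionary training loop that would produce such weights is omitted.

    import numpy as np

    def neuron_strength(W):
        # Strength of each input neuron: the sum of the absolute weights of
        # its outgoing connections in the first (sparse) layer of the
        # autoencoder. W has shape (n_features, n_hidden); under sparse
        # evolutionary training most entries are zero.
        return np.abs(W).sum(axis=1)

    def quick_select(W, k):
        # Rank all input features by strength at once and keep the k strongest.
        return np.argsort(neuron_strength(W))[::-1][:k]

    # Toy sparse weight matrix: 5 input features, 3 hidden neurons.
    W = np.array([[0.0, 0.9, 0.0],
                  [0.1, 0.0, 0.0],
                  [0.0, 0.0, 0.0],
                  [0.7, 0.0, 0.4],
                  [0.0, 0.2, 0.0]])
    print(quick_select(W, k=2))  # -> [3 0]: features 3 and 0 are strongest

Because the strength of every input neuron is available directly from the trained sparse weights, all features are ranked in a single pass, which is what makes the criterion cheap to evaluate.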
Original language: English
Publication status: Published - Jul 2021
Event: Sparsity in Neural Networks: Advancing Understanding and Practice 2021 - Online
Duration: 8 Jul 2021 – 9 Jul 2021
Conference number: 1
https://sites.google.com/view/sparsity-workshop-2021/

Workshop

Workshop: Sparsity in Neural Networks: Advancing Understanding and Practice 2021
Abbreviated title: SNN Workshop 2021
City: Online
Period: 8/07/21 – 9/07/21
Internet address: https://sites.google.com/view/sparsity-workshop-2021/

Keywords

  • Feature Selection
