Understanding dynamic sparse training capabilities in accommodating sparse data

Research output: Contribution to conference › Paper › peer-review


Abstract

Deep learning algorithms have become the state-of-the-art models for various tasks across a wide range of applications. The most advanced deep learning models have a very large number of parameters, which increases their cost, computational requirements, and memory footprint. Recently, dynamic sparse training methods have shown that sparse neural networks can outperform dense neural networks while reducing the number of parameters (connections) quadratically. So far, all the proposed sparse training methods have been tested on well-known benchmark datasets free of data quality problems. In real-world data science applications, however, many data quality challenges can arise (e.g., missing data). Missing data poses a daunting challenge to model accuracy. In this research, we aim to understand the interplay between dynamic sparse training methods and data sparsity, and how each can benefit the other.
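
For intuition, the core mechanism shared by most dynamic sparse training methods is a periodic prune-and-regrow update of a sparse connectivity mask, as in SET-style approaches. The sketch below is a minimal, illustrative NumPy implementation of one such topology update; the function name, the magnitude-based pruning criterion, and the random regrowth are assumptions reflecting generic SET-style training, not the specific method studied in this paper.

```python
import numpy as np

def prune_and_regrow(weights, mask, zeta=0.3, rng=None):
    """One SET-style topology update (illustrative sketch): drop the zeta
    fraction of active connections with the smallest magnitude, then regrow
    the same number of connections at random inactive positions, with the
    regrown weights re-initialised from a small random distribution."""
    rng = rng or np.random.default_rng()
    active = np.flatnonzero(mask)
    n_drop = int(zeta * active.size)

    # Prune: deactivate the weakest active connections.
    weakest = active[np.argsort(np.abs(weights.flat[active]))[:n_drop]]
    mask.flat[weakest] = 0
    weights.flat[weakest] = 0.0

    # Regrow: activate an equal number of random inactive positions.
    inactive = np.flatnonzero(mask == 0)
    grown = rng.choice(inactive, size=n_drop, replace=False)
    mask.flat[grown] = 1
    weights.flat[grown] = rng.normal(0.0, 0.01, size=n_drop)
    return weights, mask

# Example: an 8x8 layer at roughly 20% density, updated once.
rng = np.random.default_rng(0)
w = rng.normal(size=(8, 8))
m = (rng.random((8, 8)) < 0.2).astype(int)
w *= m
w, m = prune_and_regrow(w, m, zeta=0.3, rng=rng)
```

In actual training, such an update would run between training epochs, so the network's sparse topology evolves while the total number of connections stays fixed.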
Original language: English
Pages: 1-6
Number of pages: 6
Publication status: Published - 19 Sept 2022
Event: European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, ECML-PKDD 2022 - World Trade Center, Grenoble, France
Duration: 19 Sept 2022 – 23 Sept 2022
Conference number: 22
https://2022.ecmlpkdd.org/index.html
https://2022.ecmlpkdd.org/

Conference

Conference: European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, ECML-PKDD 2022
Abbreviated title: ECML-PKDD 2022
Country/Territory: France
City: Grenoble
Period: 19/09/22 – 23/09/22

Keywords

  • Deep learning
  • Dynamic sparse training
  • Sparse data
