Abstract
Deep learning algorithms have become the state-of-the-art models for a wide range of tasks across many application areas. The most advanced deep learning models have many parameters, which increases costs, computational requirements, and memory footprints. Recently, dynamic sparse training methods have shown that sparse neural networks can outperform dense neural networks while reducing the number of parameters (connections) quadratically. So far, all the proposed sparse training methods have been tested on well-known benchmark datasets free of data quality problems. However, in real-world data science applications, many data quality challenges may appear, e.g. missing data. Missing data can pose daunting challenges to model accuracy. In this research, we aim to understand the interplay between dynamic sparse training methods and data sparsity, for their mutual benefit.
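To make the idea of dynamic sparse training concrete, the sketch below shows one topology-update step in the style of SET-like prune-and-regrow methods: the smallest-magnitude active weights are pruned, and an equal number of connections are regrown at random inactive positions, so the parameter budget stays fixed. This is a minimal illustration assuming a NumPy weight matrix and boolean mask, not the exact procedure evaluated in the abstract; the function name `set_update` and the `prune_frac` parameter are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def set_update(weights, mask, prune_frac=0.3):
    """One SET-style topology update (illustrative sketch):
    prune the smallest-magnitude active weights, then regrow the
    same number of connections at random inactive positions."""
    active = np.flatnonzero(mask)
    n_prune = int(prune_frac * active.size)
    # Prune: drop the active connections with the smallest |w|.
    order = np.argsort(np.abs(weights.ravel()[active]))
    pruned = active[order[:n_prune]]
    mask.ravel()[pruned] = False
    weights.ravel()[pruned] = 0.0
    # Regrow: activate an equal number of random inactive connections,
    # initialized with small random values.
    inactive = np.flatnonzero(~mask.ravel())
    grown = rng.choice(inactive, size=n_prune, replace=False)
    mask.ravel()[grown] = True
    weights.ravel()[grown] = rng.normal(0.0, 0.01, size=n_prune)
    return weights, mask

# Usage: a layer kept at a fixed ~20% density across updates.
w = rng.normal(size=(8, 8))
m = rng.random((8, 8)) < 0.2
w = w * m
w, m = set_update(w, m)
```

Because the number of pruned and regrown connections is equal, the sparsity level is invariant across updates; only the topology evolves during training.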
Original language | English |
---|---|
Pages | 1-6 |
Number of pages | 6 |
Publication status | Published - 19 Sept 2022 |
Event | European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, ECML-PKDD 2022 - World Trade Center, Grenoble, France |
Duration | 19 Sept 2022 → 23 Sept 2022 |
Conference number | 22 |
Internet address | https://2022.ecmlpkdd.org/ |
Conference
Conference | European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, ECML-PKDD 2022 |
---|---|
Abbreviated title | ECML-PKDD2022 |
Country/Territory | France |
City | Grenoble |
Period | 19/09/22 → 23/09/22 |
Internet address | https://2022.ecmlpkdd.org/ |
Keywords
- Deep learning
- Dynamic sparse training
- Sparse data