On the expressivity of deep Heaviside networks

Research output: Working paper › Preprint › Academic


Abstract

We show that deep Heaviside networks (DHNs) have limited expressiveness but that this can be overcome by including either skip connections or neurons with linear activation. We provide lower and upper bounds for the Vapnik-Chervonenkis (VC) dimensions and approximation rates of these network classes. As an application, we derive statistical convergence rates for DHN fits in the nonparametric regression model.
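To make the setting concrete, the following is a minimal sketch of a forward pass through a deep Heaviside network, and a variant with skip connections; it is not code from the paper, and all names, shapes, and the concatenation-style skip connection are illustrative assumptions. Because every hidden layer of a plain DHN emits a binary vector, downstream layers lose all continuous information about the input, which is the intuition behind the limited expressiveness; a skip connection reinstates access to the raw input.

```python
import numpy as np

def heaviside(z):
    # Heaviside step activation: 1 where z >= 0, else 0.
    return (z >= 0).astype(float)

def dhn_forward(x, weights, biases):
    """Plain deep Heaviside network: every layer's output is a
    binary vector, so later layers see only thresholded values."""
    h = x
    for W, b in zip(weights, biases):
        h = heaviside(W @ h + b)
    return h

def dhn_skip_forward(x, weights, biases):
    """Illustrative skip-connection variant: the raw input is
    concatenated onto each layer's binary output, so every layer
    retains access to the continuous input coordinates."""
    h = x
    for W, b in zip(weights, biases):
        h = np.concatenate([heaviside(W @ h + b), x])
    return h
```

Note that with this concatenation-style skip connection the input dimension of each subsequent weight matrix grows by the input size; an additive skip or linear neurons (the paper's other remedy) would keep dimensions fixed.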
Original language: English
Publisher: ArXiv.org
DOIs
Publication status: Published - 30 Apr 2025

Keywords

  • stat.ML
  • cs.LG
  • cs.NA
  • math.NA

