Approximating smooth functions by deep neural networks with sigmoid activation function

Sophie Langer

Research output: Contribution to journal › Article › Academic › peer-review

6 Citations (Scopus)


We study the power of deep neural networks (DNNs) with sigmoid activation function. Recently, it was shown that DNNs approximate any d-dimensional, smooth function on a compact set with a rate of order W^{-p/d}, where W is the number of nonzero weights in the network and p is the smoothness of the function. Unfortunately, these rates only hold for a special class of sparsely connected DNNs. We ask ourselves if we can show the same approximation rate for a simpler and more general class, i.e., DNNs which are defined only by their width and depth. In this article we show that DNNs with fixed depth and a width of order M^d achieve an approximation rate of M^{-2p}. As a conclusion we quantitatively characterize the approximation power of DNNs in terms of the overall number of weights W_0 in the network and show an approximation rate of W_0^{-p/d}. This more general result finally helps us to understand which network topology guarantees a given target accuracy.
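The two rates stated in the abstract are consistent under a simple weight count. The following is a hedged sketch, assuming a fully connected network of fixed depth, so that the total number of weights scales like the square of the width (this assumption is not stated in the abstract itself):

```latex
% Assumption (not from the abstract): fixed depth and width of order M^d,
% fully connected layers, so the total weight count satisfies
%   W_0 \asymp (M^d)^2 = M^{2d}.
W_0 \asymp M^{2d}
\quad\Longrightarrow\quad
M^{-2p} = \bigl(M^{2d}\bigr)^{-p/d} = W_0^{-p/d},
% which recovers the rate W_0^{-p/d} from the rate M^{-2p}.
```

Under this weight-count assumption, the fixed-depth rate M^{-2p} and the overall rate W_0^{-p/d} are the same statement in different parameters.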
Original language: English
Article number: 104696
Number of pages: 21
Journal: Journal of Multivariate Analysis
Issue number: C
Early online date: 10 Nov 2020
Publication status: Published - 2021
Externally published: Yes


