Analysis of the rate of convergence of fully connected deep neural network regression estimates with smooth activation function

Sophie Langer

Research output: Contribution to journal › Article › Academic › peer-review

5 Citations (Scopus)

Abstract

We study the power of deep neural networks (DNNs) with sigmoid activation function. Recently, it was shown that DNNs approximate any d-dimensional, smooth function on a compact set with a rate of order W^(-p/d), where W is the number of nonzero weights in the network and p is the smoothness of the function. Unfortunately, these rates only hold for a special class of sparsely connected DNNs. We ask whether the same approximation rate can be shown for a simpler and more general class, i.e., DNNs which are defined only by their width and depth. In this article we show that DNNs with fixed depth and a width of order M^d achieve an approximation rate of M^(-2p). As a consequence, we quantitatively characterize the approximation power of DNNs in terms of the overall number of weights W_0 in the network and show an approximation rate of W_0^(-p/d). This more general result finally helps us to understand which network topology guarantees a given target accuracy.
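The step from the width-based rate to the weight-based rate in the abstract compresses a short calculation. A minimal sketch of it follows; the notation (depth L, hidden width K, constant c_L) is illustrative and not taken from the paper itself.

% Illustrative weight count for a fully connected network of fixed depth L
% with hidden layers of width K = M^d (assumed notation, not the paper's):
% each layer contributes about K^2 weights, so W_0 ≈ L * K^2 = L * M^{2d},
% hence M ≈ (W_0 / L)^{1/(2d)} and the width-based rate rewrites as
\[
  M^{-2p} \;\approx\; \left(\frac{W_0}{L}\right)^{-2p/(2d)}
  \;=\; c_L \, W_0^{-p/d},
\]
% recovering the stated rate W_0^{-p/d} up to a depth-dependent constant c_L.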
Original language: English
Article number: 104695
Number of pages: 14
Journal: Journal of Multivariate Analysis
Volume: 182
Issue number: C
Publication status: Published - 2021
Externally published: Yes
