Sophie Langer*
Research output: Contribution to journal › Article › Academic › peer-review
We study the power of deep neural networks (DNNs) with sigmoid activation function. Recently, it was shown that DNNs approximate any d-dimensional, smooth function on a compact set with a rate of order $W^{-p/d}$, where $W$ is the number of nonzero weights in the network and $p$ is the smoothness of the function. Unfortunately, these rates only hold for a special class of sparsely connected DNNs. We ask whether the same approximation rate can be shown for a simpler and more general class, i.e., DNNs which are defined only by their width and depth. In this article we show that DNNs with fixed depth and a width of order $M^d$ achieve an approximation rate of $M^{-2p}$. As a consequence, we quantitatively characterize the approximation power of DNNs in terms of the overall number of weights $W_0$ in the network and show an approximation rate of $W_0^{-p/d}$. This more general result finally helps us to understand which network topology guarantees a given target accuracy.
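The two rates in the abstract are connected by a weight-count argument: a fully connected network of fixed depth and width $M^d$ has $W_0 \sim M^{2d}$ weights, so $W_0^{-p/d} \sim M^{-2p}$. The following sketch illustrates this arithmetic numerically; the depth, smoothness, and dimension values are illustrative assumptions, not taken from the paper.

```python
# Illustrative check (not from the paper): for a fully connected network of
# fixed depth L and width M**d, the total weight count W0 scales like M**(2d),
# so the width-based rate M^{-2p} matches the weight-based rate W0^{-p/d}
# up to constants.

def total_weights(width: int, depth: int, dim: int) -> int:
    """Rough weight count of a fully connected net: input layer dim -> width,
    (depth - 1) hidden layers width -> width, output layer width -> 1."""
    return dim * width + (depth - 1) * width * width + width

d, p, L = 3, 2.0, 4           # input dimension, smoothness, fixed depth (assumed)
M = 10
width = M ** d                # width of order M^d, as in the abstract
W0 = total_weights(width, L, d)

rate_width = M ** (-2 * p)    # rate M^{-2p}
rate_weights = W0 ** (-p / d) # rate W0^{-p/d}

# Since W0 ~ width^2 = M^{2d}, the two rates agree up to a constant factor.
ratio = rate_weights / rate_width
print(f"W0 = {W0}, M^(-2p) = {rate_width:.3e}, "
      f"W0^(-p/d) = {rate_weights:.3e}, ratio = {ratio:.3f}")
```

The ratio stays bounded as $M$ grows, which is the sense in which the width-based and weight-based rates are equivalent.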
Original language | English
---|---
Article number | 104696
Number of pages | 21
Journal | Journal of Multivariate Analysis
Volume | 182
Issue number | C
Early online date | 10 Nov 2020
DOIs |
Publication status | Published - Mar 2021
Externally published | Yes