We study the power of deep neural networks (DNNs) with sigmoid activation function. Recently, it was shown that DNNs approximate any $d$-dimensional, smooth function on a compact set with a rate of order $W^{-p/d}$, where $W$ is the number of nonzero weights in the network and $p$ is the smoothness of the function. Unfortunately, these rates only hold for a special class of sparsely connected DNNs. We ask whether the same approximation rate can be shown for a simpler and more general class, i.e., DNNs that are defined only by their width and depth. In this article we show that DNNs with fixed depth and a width of order $M^d$ achieve an approximation rate of $M^{-2p}$. As a consequence, we quantitatively characterize the approximation power of DNNs in terms of the overall number of weights $W_0$ in the network and show an approximation rate of $W_0^{-p/d}$. This more general result finally helps us to understand which network topology guarantees a given target accuracy.
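The stated rates can be turned into a rough back-of-the-envelope size estimate. The sketch below (an illustration, not part of the paper: constants are ignored, and the function `network_size_for_accuracy` is a hypothetical helper) inverts the rate $M^{-2p} = \varepsilon$ to get the width parameter $M$, then counts weights of order $M^d$, recovering the $W_0^{-p/d}$ rate.

```python
import math

def network_size_for_accuracy(eps, p, d):
    """Estimate network size for target accuracy eps, smoothness p,
    input dimension d, using the abstract's rates with constants dropped."""
    # Rate M^{-2p} = eps  =>  M = eps^{-1/(2p)}
    M = math.ceil(eps ** (-1.0 / (2 * p)))
    # Width (and hence overall weight count W0) of order M^d;
    # then eps ~ W0^{-p/d}, matching the quoted rate in W0.
    W0 = M ** d
    return M, W0

# Example: accuracy 1e-2 for a twice-smooth function in dimension 3.
M, W0 = network_size_for_accuracy(eps=1e-2, p=2, d=3)
print(M, W0)  # → 4 64
```

Note how doubling the smoothness $p$ shrinks the required $M$, while the input dimension $d$ enters only through the weight count $W_0 = M^d$, the usual curse-of-dimensionality effect.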