A brief note on understanding neural networks as Gaussian processes

Research output: Working paper



As a generalization of the work in [Lee et al., 2017], this note briefly discusses when the prior of a neural network output follows a Gaussian process, and how a neural-network-induced Gaussian process is formulated. The posterior mean functions of such a Gaussian process regression lie in the reproducing kernel Hilbert space defined by the neural-network-induced kernel. In the case of two-layer neural networks, the induced Gaussian processes provide an interpretation of the reproducing kernel Hilbert spaces whose union forms a Barron space.
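The correspondence the abstract describes can be illustrated concretely. For a two-layer network f(x) = m^{-1/2} Σ_j a_j φ(w_j·x) with i.i.d. standard-normal weights and ReLU activation φ, the induced Gaussian-process kernel E[f(x)f(x′)] has the known closed form of the degree-1 arc-cosine kernel. The sketch below (an illustration under these standard assumptions, not code from the paper) checks this by Monte Carlo sampling of random hidden-layer weights:

```python
import numpy as np

def arccos_kernel(x, xp):
    """Closed-form NNGP kernel for one ReLU hidden layer with
    N(0, I) weights: (1/2pi) * |x||x'| * (sin t + (pi - t) cos t),
    where t is the angle between x and x'."""
    nx, nxp = np.linalg.norm(x), np.linalg.norm(xp)
    cos_t = np.clip(x @ xp / (nx * nxp), -1.0, 1.0)
    theta = np.arccos(cos_t)
    return nx * nxp * (np.sin(theta) + (np.pi - theta) * cos_t) / (2 * np.pi)

def mc_kernel(x, xp, n_samples=200_000, seed=0):
    """Monte Carlo estimate of E_w[relu(w.x) relu(w.x')], w ~ N(0, I),
    i.e. the infinite-width limit of the empirical covariance."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((n_samples, x.size))
    return np.mean(np.maximum(W @ x, 0.0) * np.maximum(W @ xp, 0.0))

# Hypothetical test inputs, chosen only for illustration.
x = np.array([1.0, 0.5, -0.3])
xp = np.array([0.2, -1.0, 0.7])
print(arccos_kernel(x, xp), mc_kernel(x, xp))  # the two values agree closely
```

The Monte Carlo average converges to the closed-form kernel as the number of sampled weight vectors grows, which is exactly the sense in which the wide-network prior "follows" a Gaussian process with a neural-network-induced kernel.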
Original language: English
Publication status: Published - 25 Jul 2021


  • cs.LG
  • cs.CE
  • stat.ML
