A brief note on understanding neural networks as Gaussian processes

Research output: Working paper


Abstract

As a generalization of the work in [Lee et al., 2017], this note briefly discusses when the prior of a neural network output follows a Gaussian process, and how a neural-network-induced Gaussian process is formulated. The posterior mean functions of the resulting Gaussian process regression lie in the reproducing kernel Hilbert space defined by the neural-network-induced kernel. In the case of two-layer neural networks, the induced Gaussian processes provide an interpretation of the reproducing kernel Hilbert spaces whose union forms a Barron space.
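The correspondence the abstract describes can be illustrated with a small numerical sketch. The note itself does not supply code; the example below assumes a two-layer ReLU network with i.i.d. standard Gaussian weights and the standard 1/sqrt(width) output scaling, under which the prior over outputs converges to a Gaussian process whose kernel has the well-known arc-cosine closed form. The function names are illustrative, not from the paper.

```python
import numpy as np

def empirical_nngp_kernel(x1, x2, width=20000, seed=0):
    # Two-layer network f(x) = (1/sqrt(m)) * sum_j a_j * relu(w_j . x),
    # with w_j ~ N(0, I) and a_j ~ N(0, 1). As the width m grows, the
    # prior covariance of f converges to E_w[relu(w.x1) * relu(w.x2)],
    # which we estimate here by Monte Carlo over the hidden weights.
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((width, x1.shape[0]))
    h1 = np.maximum(W @ x1, 0.0)  # hidden-layer activations for x1
    h2 = np.maximum(W @ x2, 0.0)  # hidden-layer activations for x2
    return (h1 @ h2) / width

def relu_nngp_kernel(x1, x2):
    # Closed-form limit kernel for ReLU (first-order arc-cosine kernel):
    # k(x1, x2) = |x1||x2| * (sin t + (pi - t) cos t) / (2 pi),
    # where t is the angle between x1 and x2.
    n1, n2 = np.linalg.norm(x1), np.linalg.norm(x2)
    t = np.arccos(np.clip(x1 @ x2 / (n1 * n2), -1.0, 1.0))
    return n1 * n2 * (np.sin(t) + (np.pi - t) * np.cos(t)) / (2 * np.pi)

x1 = np.array([1.0, 0.5])
x2 = np.array([0.3, -0.2])
emp = empirical_nngp_kernel(x1, x2)
exact = relu_nngp_kernel(x1, x2)
```

At moderate widths the Monte Carlo estimate already agrees with the closed-form kernel to a couple of decimal places, which is the finite-width face of the infinite-width Gaussian process prior discussed in the abstract.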
Original language: English
Publisher: arXiv.org
Publication status: Published - 25 Jul 2021

Keywords

  • cs.LG
  • cs.CE
  • stat.ML
