Posterior contraction for deep Gaussian process priors

Gianluca Finocchio, Anselm Johannes Schmidt-Hieber

Research output: Working paper



We study posterior contraction rates for a class of deep Gaussian process priors applied to the nonparametric regression problem under a general composition assumption on the regression function. It is shown that the contraction rates can achieve the minimax convergence rate (up to $\log n$ factors) while being adaptive to the underlying structure and smoothness of the target function. The proposed framework extends Bayesian nonparametric theory for Gaussian process priors.
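A deep Gaussian process prior draws the regression function as a composition of ordinary Gaussian process sample paths, mirroring the composition assumption on the target. The following is a minimal NumPy sketch of sampling from such a prior; the squared-exponential kernel, depth, length scale, and jitter value are illustrative choices, not the paper's specific construction.

```python
import numpy as np

def rbf_kernel(x, y, length_scale=0.5):
    # Squared-exponential (RBF) kernel on 1-D inputs.
    d = x[:, None] - y[None, :]
    return np.exp(-0.5 * (d / length_scale) ** 2)

def sample_gp(x, rng, length_scale=0.5, jitter=1e-8):
    # One sample path of a centered GP evaluated at the points x;
    # jitter on the diagonal keeps the covariance well-conditioned.
    K = rbf_kernel(x, x, length_scale) + jitter * np.eye(len(x))
    return rng.multivariate_normal(np.zeros(len(x)), K)

def sample_deep_gp(x, depth=2, rng=None):
    # Deep GP sample: f = g_depth ∘ ... ∘ g_1, where each layer's
    # output becomes the input locations of the next GP layer.
    rng = rng if rng is not None else np.random.default_rng(0)
    h = x
    for _ in range(depth):
        h = sample_gp(h, rng)
    return h

x = np.linspace(0.0, 1.0, 50)
f = sample_deep_gp(x, depth=2)  # one draw from a 2-layer deep GP prior
```

In a Bayesian regression analysis, such draws would serve as the prior over regression functions, and the posterior would be formed by conditioning on noisy observations of the target function.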
Original language: English
Number of pages: 48
Publication status: Published - 22 May 2021


  • Bayesian inference
  • Nonparametric regression
  • Contraction rates
  • Uncertainty quantification
  • Neural networks
  • Deep Gaussian processes


