Posterior Contraction for Deep Gaussian Process Priors

Gianluca Finocchio, Anselm Johannes Schmidt-Hieber

Research output: Contribution to journal › Article › Academic › peer-review



We study posterior contraction rates for a class of deep Gaussian process priors in the nonparametric regression setting under a general composition assumption on the regression function. It is shown that the contraction rates can achieve the minimax convergence rate (up to log n factors), while being adaptive to the underlying structure and smoothness of the target function. The proposed framework extends the Bayesian nonparametric theory for Gaussian process priors.
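Deep Gaussian process priors are, in general terms, built by composing independent Gaussian process draws, mirroring the composition structure assumed for the regression function. The sketch below illustrates this idea only; the kernel choice (squared-exponential), depth, length scales, and jitter are assumptions for the illustration and are not the paper's exact prior construction.

```python
import numpy as np

def rbf_kernel(x, y, length_scale=0.3):
    # Squared-exponential covariance between two 1-D input vectors.
    d = x[:, None] - y[None, :]
    return np.exp(-0.5 * (d / length_scale) ** 2)

def sample_gp(x, length_scale, rng):
    # One zero-mean GP sample path evaluated at the points x.
    # Small jitter on the diagonal keeps the covariance matrix
    # numerically positive definite.
    K = rbf_kernel(x, x, length_scale) + 1e-8 * np.eye(len(x))
    return rng.multivariate_normal(np.zeros(len(x)), K)

def sample_deep_gp(x, depth=2, length_scale=0.3, seed=0):
    # Compose `depth` independent GP draws: f = g_depth ∘ ... ∘ g_1.
    # Each layer's output becomes the next layer's input locations.
    rng = np.random.default_rng(seed)
    h = x.copy()
    for _ in range(depth):
        h = sample_gp(h, length_scale, rng)
    return h

x = np.linspace(0.0, 1.0, 50)
f = sample_deep_gp(x, depth=3)
print(f.shape)  # (50,)
```

In a Bayesian regression setting one would place this composed process as the prior on the regression function and form the posterior given noisy observations; the paper's results concern how fast that posterior concentrates around the truth.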
Original language: English
Pages (from-to): 1-49
Number of pages: 49
Journal: Journal of Machine Learning Research
Issue number: 66
Publication status: Published - Feb 2023


  • Bayesian nonparametric regression
  • contraction rates
  • deep Gaussian processes
  • uncertainty quantification
  • neural networks


