Abstract
We study posterior contraction rates for a class of deep Gaussian process priors in the nonparametric regression problem under a general composition assumption on the regression function. We show that the contraction rates attain the minimax convergence rate (up to $\log n$ factors) while adapting to the underlying compositional structure and smoothness of the target function. The proposed framework extends Bayesian nonparametric theory for Gaussian process priors.
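The abstract does not spell out the composition assumption; as a hedged sketch, the standard form used in this literature (with $q$, $d_i$, $t_i$, $\beta_i$ as assumed notation not taken from the abstract) is

$$ f_0 = g_q \circ g_{q-1} \circ \cdots \circ g_0, \qquad g_i : [a_i, b_i]^{d_i} \to [a_{i+1}, b_{i+1}]^{d_{i+1}}, $$

where each component of $g_i$ is $\beta_i$-Hölder and depends on at most $t_i$ of its arguments. Under such assumptions, the minimax rate typically takes the form (up to $\log n$ factors)

$$ \max_{0 \le i \le q} \, n^{-\beta_i^*/(2\beta_i^* + t_i)}, \qquad \beta_i^* := \beta_i \prod_{\ell = i+1}^{q} (\beta_\ell \wedge 1), $$

where $\beta_i^*$ is the effective smoothness of layer $i$ after composition with the later layers.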
| Original language | English |
|---|---|
| Number of pages | 48 |
| Publication status | Published - 22 May 2021 |
Keywords
- Bayesian inference
- Nonparametric regression
- Contraction rates
- Uncertainty quantification
- Neural networks
- Deep Gaussian processes
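For intuition only, and not the authors' construction: a draw from a deep Gaussian process prior can be simulated by composing sample paths of ordinary GPs. The two-layer depth, the squared-exponential kernel, the length scales, and the rescaling/interpolation step below are all illustrative assumptions.

```python
# Illustrative sketch: one draw from a two-layer ("deep") GP prior,
# obtained by composing draws of ordinary GPs on a 1-d grid.
import numpy as np

def gp_draw(x, length_scale=0.2, rng=None):
    """Sample a centered GP path with squared-exponential kernel at inputs x."""
    rng = np.random.default_rng() if rng is None else rng
    d = x[:, None] - x[None, :]
    K = np.exp(-0.5 * (d / length_scale) ** 2)
    # Small diagonal jitter keeps the Cholesky factorization stable.
    L = np.linalg.cholesky(K + 1e-8 * np.eye(len(x)))
    return L @ rng.standard_normal(len(x))

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 200)

# Inner layer: a GP draw g0, rescaled to [0, 1] so it can feed the next layer.
g0 = gp_draw(x, length_scale=0.3, rng=rng)
g0 = (g0 - g0.min()) / (g0.max() - g0.min())

# Outer layer: evaluate a second GP draw g1 at the inner layer's output,
# approximated by interpolating g1's sample path on the grid.
g1 = gp_draw(x, length_scale=0.2, rng=rng)
f = np.interp(g0, x, g1)  # f = g1 o g0, one draw from the composed prior
print(f[:5])
```

The composition mirrors the structural assumption on the regression function: each layer of the prior models one layer of the target's composition.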